Function approximation is essential in large state spaces, and linear methods combined with semi-gradient descent provide a practical route. We will apply a linear method to value-function approximation on a classic reinforcement learning control problem: Mountain Car. In the Mountain Car problem (from the Gymnasium library), an underpowered car sits in a valley and must climb the hill on its right to reach the goal. Because gravity is stronger than the car's engine, it cannot drive straight up; it must build momentum by driving back and forth between the hills.

The state $s$ is continuous and defined by two variables: position $p \in [-1.2, 0.6]$ and velocity $v \in [-0.07, 0.07]$. Because the state space is continuous, we cannot use a simple table to store a value for every possible state, which makes this an ideal candidate for function approximation. Our goal is to estimate the state-value function $V(s)$ under a fixed, simple policy using the semi-gradient TD(0) method. We approximate $V(s)$ with a linear function:

$$ \hat{v}(s, \mathbf{w}) = \mathbf{w}^T \mathbf{x}(s) = \sum_{i=1}^{d} w_i x_i(s) $$

where $\mathbf{x}(s)$ is a feature vector derived from state $s$ and $\mathbf{w}$ is the weight vector we need to learn.

## Feature Engineering: Tile Coding

A common and effective way to create features for linear function approximation over a continuous state space is tile coding. Imagine overlaying multiple grids (tilings) on the state space, each slightly offset from the others. For a given state, we determine which tile it falls into within each grid. The feature vector $\mathbf{x}(s)$ then becomes a large binary vector with one component per tile per tiling: a component is 1 if the state falls into the corresponding tile, and 0 otherwise.

This approach discretizes the space in a distributed way. Each state activates several features (one per tiling), and similar states activate many of the same features, which enables generalization: neighboring states share more active tiles than distant ones.

For our Mountain Car example, we can define the number of tilings and the resolution of each grid (tiles per dimension). A state $(p, v)$ will activate one tile in each tiling.

```python
import numpy as np

# Example configuration for tile coding
num_tilings = 8
num_tiles_per_dim = 8  # an 8x8 grid per tiling

# State-space bounds, used for normalization/scaling
pos_min, pos_max = -1.2, 0.6
vel_min, vel_max = -0.07, 0.07

# Total number of features = tilings * (tiles per dim)^2
total_features = num_tilings * (num_tiles_per_dim ** 2)

# Function returning the active features (tile indices) for a state.
# Note: real implementations typically use a library tile coder such as
# a `TileCoder` class; this is a placeholder.
def get_feature_vector(state, num_tilings, tiles_per_dim, total_features):
    position, velocity = state
    feature_indices = []  # indices of active tiles (features)
    # --- Placeholder for actual tile-coding logic ---
    # Real code would compute which tile the state falls into for each
    # of the `num_tilings` tilings, taking the offsets into account.
    # Replace the hash below with a proper tile computation, e.g.
    # feature_indices = compute_active_tiles(state, config)
    for i in range(num_tilings):
        # Simplified hash/index computation, for demonstration only
        idx = hash((round(position * 10), round(velocity * 100), i)) % total_features
        feature_indices.append(idx)
    # Build the binary feature vector from the active indices
    x = np.zeros(total_features)
    if feature_indices:
        x[np.array(feature_indices, dtype=int)] = 1.0
    return x

# --- A more concrete, simplified grid tile coder ---
# (an alternative to the placeholder above)
def get_simple_grid_features(state, num_tilings, tiles_per_dim, total_features):
    position, velocity = state
    pos_scale = tiles_per_dim / (pos_max - pos_min)
    vel_scale = tiles_per_dim / (vel_max - vel_min)
    features = np.zeros(total_features)
    for i in range(num_tilings):
        # Apply a simple offset for each tiling
        offset_factor = i / num_tilings
        pos_offset = offset_factor * (pos_max - pos_min) / tiles_per_dim
        vel_offset = offset_factor * (vel_max - vel_min) / tiles_per_dim
        pos_shifted = position + pos_offset
        vel_shifted = velocity + vel_offset
        # Find the tile indices for this tiling
        pos_tile = int((pos_shifted - pos_min) * pos_scale)
        vel_tile = int((vel_shifted - vel_min) * vel_scale)
        # Clamp indices to the grid
        pos_tile = max(0, min(tiles_per_dim - 1, pos_tile))
        vel_tile = max(0, min(tiles_per_dim - 1, vel_tile))
        # Flat index of this tile within this tiling
        base_index = i * (tiles_per_dim ** 2)
        tile_index = base_index + vel_tile * tiles_per_dim + pos_tile
        features[tile_index] = 1.0  # activate this feature
    return features
```

Note: `get_simple_grid_features` provides a basic grid tile coder. Proper tile coding usually involves more sophisticated hashing and offset strategies for better generalization, but this shows the core idea.

## Prediction: Implementing Semi-Gradient TD(0)

We now use the semi-gradient TD(0) algorithm to learn the weights $\mathbf{w}$ of our linear value-function approximator $\hat{v}(s, \mathbf{w})$. We evaluate a simple fixed policy: always take the action with index 2 (accelerate right). Given a transition from state $S$ to state $S'$ with reward $R$, the per-step update rule for $\mathbf{w}$ is:

$$ \mathbf{w} \leftarrow \mathbf{w} + \alpha [R + \gamma \hat{v}(S', \mathbf{w}) - \hat{v}(S, \mathbf{w})] \nabla \hat{v}(S, \mathbf{w}) $$

Since $\hat{v}(S, \mathbf{w}) = \mathbf{w}^T \mathbf{x}(S)$, the gradient with respect to $\mathbf{w}$ is simply the feature vector $\mathbf{x}(S)$, so the update becomes:

$$ \mathbf{w} \leftarrow \mathbf{w} + \alpha [R + \gamma \mathbf{w}^T \mathbf{x}(S') - \mathbf{w}^T \mathbf{x}(S)] \mathbf{x}(S) $$

Let's implement the learning loop.

```python
import numpy as np
import gymnasium as gym
# import matplotlib.pyplot as plt  # for optional plotting

# --- Parameters ---
alpha = 0.1 / num_tilings  # learning rate, typically scaled by the number of active features
gamma = 1.0                # discount factor (gamma=1 is common for the episodic Mountain Car task)
num_episodes = 5000

# Use the simplified grid tile coder
feature_func = get_simple_grid_features

# --- Initialization ---
weights = np.zeros(total_features)
env = gym.make('MountainCar-v0')

# Helper for prediction
def predict_value(state, w):
    features = feature_func(state, num_tilings, num_tiles_per_dim, total_features)
    return np.dot(w, features)

# --- Learning loop ---
episode_rewards = []  # per-episode reward log

print("Starting training...")
for episode in range(num_episodes):
    state, info = env.reset()
    done = False
    total_reward = 0
    step_count = 0

    while not done:
        # Fixed policy: always choose action 2 (accelerate right)
        action = 2

        # Features and value of the current state S
        current_features = feature_func(state, num_tilings, num_tiles_per_dim, total_features)
        current_value = np.dot(weights, current_features)

        # Take the action; observe next state S' and reward R
        next_state, reward, terminated, truncated, info = env.step(action)
        done = terminated or truncated
        total_reward += reward

        # TD target (bootstrap only if the episode did not terminate)
        next_value = predict_value(next_state, weights) if not terminated else 0.0
        td_target = reward + gamma * next_value

        # TD error
        td_error = td_target - current_value

        # Semi-gradient TD(0) update: the gradient is just current_features
        weights += alpha * td_error * current_features

        # Move on to the next state
        state = next_state
        step_count += 1

        # Safety break for unusually long episodes
        if step_count > 10000:
            print(f"Warning: episode {episode + 1} exceeded 10000 steps. Breaking.")
            done = True  # force the break

    episode_rewards.append(total_reward)
    if (episode + 1) % 500 == 0:
        print(f"Episode {episode + 1}/{num_episodes} done. Total reward: {total_reward}")

print("Training complete.")
env.close()

# --- Optional: analyze results ---
# You can plot episode_rewards to see whether the fixed policy improves
# (probably not by much, since the goal here is value prediction, not
# policy improvement). More interesting is visualizing the learned
# value function.
```

## Visualizing the Learned Value Function

After training, the `weights` vector holds the learned parameters, so we can estimate $\hat{v}(s, \mathbf{w})$ for any state $s$. An effective way to see what the agent has learned is to plot the value function (since costs are negative rewards, higher values mean "closer to the goal" or "a better state"). We expect states near the goal position ($p > 0.5$), or moving toward the goal at high velocity, to have higher (less negative) values.

We create a grid of states (position, velocity), compute the predicted value at each point using the learned `weights`, and visualize the result as a heatmap or contour plot.

```python
# --- Visualization code (using Plotly) ---
import plotly.graph_objects as go
import numpy as np  # ensure numpy is imported

# Generate the grid of evaluation points
positions = np.linspace(pos_min, pos_max, 30)   # denser grid for a smoother plot
velocities = np.linspace(vel_min, vel_max, 30)
value_grid = np.zeros((len(velocities), len(positions)))

for i, vel in enumerate(velocities):
    for j, pos in enumerate(positions):
        state_eval = (pos, vel)
        # Predict the value using the trained weights
        value_grid[i, j] = predict_value(state_eval, weights)
```
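The smoothness of the resulting plot depends on the generalization behavior of the tile coder. A quick way to check that behavior directly is a sanity test of the feature construction. The sketch below uses a hypothetical minimal coder (`tile_features`, mirroring the simplified grid approach above; the names and offsets are illustrative assumptions) to confirm two properties: each state activates exactly one tile per tiling, and nearby states share more active tiles than distant ones.

```python
import numpy as np

# Hypothetical minimal tile coder mirroring the simplified grid approach.
def tile_features(pos, vel, num_tilings=8, tiles_per_dim=8,
                  pos_range=(-1.2, 0.6), vel_range=(-0.07, 0.07)):
    pos_min, pos_max = pos_range
    vel_min, vel_max = vel_range
    total = num_tilings * tiles_per_dim ** 2
    x = np.zeros(total)
    for i in range(num_tilings):
        # Each tiling is shifted by a fraction of one tile width.
        frac = i / num_tilings
        p = pos + frac * (pos_max - pos_min) / tiles_per_dim
        v = vel + frac * (vel_max - vel_min) / tiles_per_dim
        pt = min(tiles_per_dim - 1,
                 max(0, int((p - pos_min) / (pos_max - pos_min) * tiles_per_dim)))
        vt = min(tiles_per_dim - 1,
                 max(0, int((v - vel_min) / (vel_max - vel_min) * tiles_per_dim)))
        x[i * tiles_per_dim ** 2 + vt * tiles_per_dim + pt] = 1.0
    return x

x_a = tile_features(-0.5, 0.0)
x_b = tile_features(-0.49, 0.001)   # a nearby state
x_c = tile_features(0.5, 0.05)      # a distant state

print(int(x_a.sum()))  # exactly one active tile per tiling → 8
# Nearby states share more active tiles than distant ones:
print((x_a * x_b).sum() > (x_a * x_c).sum())  # → True
```

If either check fails, the value plot will typically look blocky or noisy rather than smooth, which is a useful diagnostic before debugging the learning loop itself.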
```python
# Create the Plotly contour plot (heatmap style)
plotly_fig = go.Figure(data=go.Contour(
    z=value_grid,
    x=positions,
    y=velocities,
    colorscale='Viridis',  # or 'Blues', 'RdBu', etc.
    contours=dict(
        coloring='heatmap',  # fill between contour lines like a heatmap
        showlabels=False     # optional: show value labels on contour lines
    ),
    colorbar=dict(title='State value (V)')
))

plotly_fig.update_layout(
    title='Learned State-Value Function (V) for Mountain Car (Fixed Policy)',
    xaxis_title="Position",
    yaxis_title="Velocity",
    width=700, height=550,
    margin=dict(l=60, r=60, b=60, t=90)
)

# Show the figure (when running interactively)
# plotly_fig.show()

# Get the JSON representation for embedding:
plotly_json_string = plotly_fig.to_json(pretty=False)
# Wrap in a markdown code block (strip newlines to get a single line)
plotly_json_single_line = ''.join(plotly_json_string.splitlines())
print("\nPlotly JSON for the value-function visualization:")
# print(f"```plotly\n{plotly_json_single_line}\n```")  # paste this into the final markdown
```

*[Embedded Plotly figure: heatmap-style contour plot of the learned state value over position $\in [-1.2, 0.6]$ and velocity $\in [-0.07, 0.07]$; raw JSON data omitted.]*

Estimated state-value function $\hat{v}(s, \mathbf{w})$ for the Mountain Car environment under the always-accelerate-right policy. Higher (less negative) values indicate states the learned approximation considers better.

## Discussion

This exercise demonstrated linear function approximation for value prediction in a continuous-state environment. Key takeaways:

- **Necessity:** Tabular methods are infeasible here. Function approximation lets us handle continuous or very large state spaces by learning a parameterized function.
- **Features matter:** The choice of features (tile coding here) is important. Good features help the linear approximator capture the important variation in the true value function. Tile coding strikes a good balance between localization and generalization.
- **Semi-gradient:** We use semi-gradient updates because we bootstrap (using $\hat{v}(S', \mathbf{w})$ to update $\hat{v}(S, \mathbf{w})$); differentiating through the target would be complicated and often hurts stability. The gradient $\nabla \hat{v}(S, \mathbf{w}) = \mathbf{x}(S)$ makes the update computationally cheap.
- **Generalization:** The learned weights let us estimate the value of any state, including states never explicitly visited during training, by combining feature activations. The smooth value map in the visualization reflects this generalization.

This example focused on prediction (estimating $V(s)$ under a fixed policy). The natural next step is control: approximate the action-value function $Q(s, a)$ with similar techniques (e.g., semi-gradient SARSA or Q-learning with function approximation) to learn an optimal policy. You would then typically define features $\mathbf{x}(s, a)$ over both states and actions.
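One common way to build $\mathbf{x}(s, a)$ for a discrete action space is to give each action its own block of the weight vector: $\mathbf{x}(s, a)$ places $\mathbf{x}(s)$ in the block belonging to $a$ and zeros elsewhere, so $Q(s, a) = \mathbf{w}^T \mathbf{x}(s, a)$. The sketch below illustrates this construction with hypothetical helper names (`state_action_features`, `greedy_action`) and toy numbers; it is not the text's implementation.

```python
import numpy as np

def state_action_features(x_s, action, num_actions):
    """Place x(s) in the block belonging to `action`; other blocks stay zero."""
    d = len(x_s)
    x_sa = np.zeros(d * num_actions)
    x_sa[action * d:(action + 1) * d] = x_s
    return x_sa

def greedy_action(x_s, w, num_actions):
    """Pick argmax_a Q(s, a) where Q(s, a) = w^T x(s, a)."""
    q = [w @ state_action_features(x_s, a, num_actions) for a in range(num_actions)]
    return int(np.argmax(q))

# Toy usage: 3 actions (Mountain Car's action space) and 4 state features.
x_s = np.array([1.0, 0.0, 1.0, 0.0])
w = np.zeros(12)
w[8:12] = [0.5, 0.5, 0.5, 0.5]   # make action 2's block look best
print(greedy_action(x_s, w, 3))  # → 2
```

A semi-gradient SARSA update then mirrors the TD(0) update above, with $\mathbf{x}(S, A)$ in place of $\mathbf{x}(S)$ and an epsilon-greedy wrapper around the greedy choice for exploration.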