
deploy: e3e1236
tianxuzhang committed Dec 3, 2023
1 parent fb96423 commit 2ae63a0
Showing 3 changed files with 167 additions and 1 deletion.
93 changes: 93 additions & 0 deletions _sources/docs/回归/正则化线性回归.ipynb
Original file line number Diff line number Diff line change
Expand Up @@ -484,6 +484,99 @@
"print(\"模型系数:\")\n",
"print(multi_task_elastic_net.coef_)"
]
},
{
"cell_type": "markdown",
"id": "42f8c990",
"metadata": {},
"source": [
"## Least Angle Regression\n",
"Least Angle Regression (LARS) is an iterative algorithm for linear regression and feature selection. It builds the regression model step by step, choosing features according to their correlation with the current residual.\n",
"\n",
"The key idea of LARS is to select, at each step, the feature most correlated with the current residual, then move the coefficients along a direction equiangular to all selected features until another feature becomes equally correlated with the residual, at which point that feature joins the active set.\n",
"\n",
"The steps of the LARS algorithm are:\n",
"\n",
"* Initialization: set all coefficients to zero and compute the initial residual.\n",
"* Feature selection: find the feature most correlated with the current residual and add it to the active set.\n",
"* Step size: move along the joint (equiangular) direction of the active features until some other feature's correlation with the residual matches that of the active set.\n",
"* Update: update the coefficient vector and the residual to reflect the new step.\n",
"* Repeat steps 2-4 until the desired number of features is reached or a stopping criterion is met.\n",
"\n",
"LARS fits a linear regression model while performing feature selection. It handles high-dimensional datasets and has relatively low computational complexity.\n",
"\n",
"In Scikit-learn, you can use the Lars class for least angle regression. For example:"
]
},
{
"cell_type": "code",
"execution_count": 10,
"id": "6ed5133b",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"模型系数:\n",
"[ 0.02913523 -0.00313884 -0.03431681 0.29380121 -0.13901173 -0.08586778\n",
" 0.11000887 0.00837283 0.08400827 -0.11083981]\n"
]
}
],
"source": [
"from sklearn.linear_model import Lars\n",
"\n",
"# Fit a Lars model\n",
"lars = Lars()\n",
"lars.fit(X, y)\n",
"\n",
"# Print the fitted coefficients\n",
"print(\"模型系数:\")\n",
"print(lars.coef_)"
]
},
{
"cell_type": "markdown",
"id": "36d1ba79",
"metadata": {},
"source": [
"### LARS Lasso\n",
"\n",
"LARS Lasso (Least Angle Regression Lasso) is a regularization method that uses the LARS procedure to solve the Lasso problem, combining least angle regression with L1 regularization.\n",
"\n",
"Compared with solving the Lasso by coordinate descent or (sub)gradient methods, LARS Lasso can be more computationally efficient: it determines the order in which features enter the model in a stepwise forward fashion and traces the piecewise-linear solution path exactly, rather than iterating to convergence.\n",
"\n",
"In Scikit-learn, you can use the LassoLars class for LARS Lasso regression. Here is an example:"
]
},
{
"cell_type": "code",
"execution_count": 11,
"id": "a99de35e",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"模型系数:\n",
"[ 0. 0. 0. 0.21248063 -0.00992147 0.\n",
" 0.00512876 0. 0.00215299 -0.00265399]\n"
]
}
],
"source": [
"from sklearn.linear_model import LassoLars\n",
"\n",
"# Fit a LassoLars model with regularization strength alpha\n",
"lasso_lars = LassoLars(alpha=0.1)\n",
"lasso_lars.fit(X, y)\n",
"\n",
"# Print the fitted coefficients\n",
"print(\"模型系数:\")\n",
"print(lasso_lars.coef_)"
]
}
],
"metadata": {
Expand Down
73 changes: 73 additions & 0 deletions docs/回归/正则化线性回归.html
Expand Up @@ -385,6 +385,10 @@ <h2> Contents </h2>
<li class="toc-h3 nav-item toc-entry"><a class="reference internal nav-link" href="#id8">多任务弹性网络</a></li>
</ul>
</li>
<li class="toc-h2 nav-item toc-entry"><a class="reference internal nav-link" href="#id9">Least Angle Regression</a><ul class="nav section-nav flex-column">
<li class="toc-h3 nav-item toc-entry"><a class="reference internal nav-link" href="#lars-lasso">LARS Lasso</a></li>
</ul>
</li>
</ul>
</nav>
</div>
Expand Down Expand Up @@ -745,6 +749,71 @@ <h3>多任务弹性网络<a class="headerlink" href="#id8" title="Permalink to t
</div>
</section>
</section>
<section id="id9">
<h2>Least Angle Regression<a class="headerlink" href="#id9" title="Permalink to this heading">#</a></h2>
<p>Least Angle Regression (LARS) is an iterative algorithm for linear regression and feature selection. It builds the regression model step by step, choosing features according to their correlation with the current residual.</p>
<p>The key idea of LARS is to select, at each step, the feature most correlated with the current residual, then move the coefficients along a direction equiangular to all selected features until another feature becomes equally correlated with the residual, at which point that feature joins the active set.</p>
<p>The steps of the LARS algorithm are:</p>
<ul class="simple">
<li><p>Initialization: set all coefficients to zero and compute the initial residual.</p></li>
<li><p>Feature selection: find the feature most correlated with the current residual and add it to the active set.</p></li>
<li><p>Step size: move along the joint (equiangular) direction of the active features until some other feature's correlation with the residual matches that of the active set.</p></li>
<li><p>Update: update the coefficient vector and the residual to reflect the new step.</p></li>
<li><p>Repeat steps 2-4 until the desired number of features is reached or a stopping criterion is met.</p></li>
</ul>
<p>LARS fits a linear regression model while performing feature selection. It handles high-dimensional datasets and has relatively low computational complexity.</p>
<p>In Scikit-learn, you can use the Lars class for least angle regression. For example:</p>
<div class="cell docutils container">
<div class="cell_input docutils container">
<div class="highlight-ipython3 notranslate"><div class="highlight"><pre><span></span><span class="kn">from</span> <span class="nn">sklearn.linear_model</span> <span class="kn">import</span> <span class="n">Lars</span>

<span class="c1"># Fit a Lars model</span>
<span class="n">lars</span> <span class="o">=</span> <span class="n">Lars</span><span class="p">()</span>
<span class="n">lars</span><span class="o">.</span><span class="n">fit</span><span class="p">(</span><span class="n">X</span><span class="p">,</span> <span class="n">y</span><span class="p">)</span>

<span class="c1"># Print the fitted coefficients</span>
<span class="nb">print</span><span class="p">(</span><span class="s2">&quot;模型系数:&quot;</span><span class="p">)</span>
<span class="nb">print</span><span class="p">(</span><span class="n">lars</span><span class="o">.</span><span class="n">coef_</span><span class="p">)</span>
</pre></div>
</div>
</div>
<div class="cell_output docutils container">
<div class="output stream highlight-myst-ansi notranslate"><div class="highlight"><pre><span></span>模型系数:
[ 0.02913523 -0.00313884 -0.03431681 0.29380121 -0.13901173 -0.08586778
0.11000887 0.00837283 0.08400827 -0.11083981]
</pre></div>
</div>
</div>
</div>
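<p>The selection-and-step loop described above can be sketched as a simplified forward-stagewise procedure. This is a small-step approximation of the exact equiangular LARS update, not the real algorithm, and the data <code>X</code>, <code>y</code> and step size <code>eps</code> below are illustrative assumptions:</p>

```python
import numpy as np

def forward_stagewise(X, y, eps=0.01, n_steps=2000):
    """Small-step approximation of the LARS idea: at each step, nudge the
    coefficient of the feature most correlated with the current residual."""
    n_samples, n_features = X.shape
    beta = np.zeros(n_features)
    for _ in range(n_steps):
        residual = y - X @ beta      # current residual
        corr = X.T @ residual        # correlation of each feature with the residual
        j = np.argmax(np.abs(corr))  # most correlated feature moves
        beta[j] += eps * np.sign(corr[j])
    return beta

# Illustrative data (assumed, not from the text): two truly nonzero coefficients
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 8))
y = X @ np.array([1.5, 0, 0, -2.0, 0, 0, 0, 0]) + 0.1 * rng.standard_normal(100)
print(forward_stagewise(X, y).round(2))
```

<p>With a small enough step size, the recovered coefficients concentrate on the two features that actually generated <code>y</code>, mirroring how LARS admits features one at a time.</p>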
<section id="lars-lasso">
<h3>LARS Lasso<a class="headerlink" href="#lars-lasso" title="Permalink to this heading">#</a></h3>
<p>LARS Lasso (Least Angle Regression Lasso) is a regularization method that uses the LARS procedure to solve the Lasso problem, combining least angle regression with L1 regularization.</p>
<p>Compared with solving the Lasso by coordinate descent or (sub)gradient methods, LARS Lasso can be more computationally efficient: it determines the order in which features enter the model in a stepwise forward fashion and traces the piecewise-linear solution path exactly, rather than iterating to convergence.</p>
<p>In Scikit-learn, you can use the LassoLars class for LARS Lasso regression. Here is an example:</p>
<div class="cell docutils container">
<div class="cell_input docutils container">
<div class="highlight-ipython3 notranslate"><div class="highlight"><pre><span></span><span class="kn">from</span> <span class="nn">sklearn.linear_model</span> <span class="kn">import</span> <span class="n">LassoLars</span>

<span class="c1"># Fit a LassoLars model with regularization strength alpha</span>
<span class="n">lasso_lars</span> <span class="o">=</span> <span class="n">LassoLars</span><span class="p">(</span><span class="n">alpha</span><span class="o">=</span><span class="mf">0.1</span><span class="p">)</span>
<span class="n">lasso_lars</span><span class="o">.</span><span class="n">fit</span><span class="p">(</span><span class="n">X</span><span class="p">,</span> <span class="n">y</span><span class="p">)</span>

<span class="c1"># Print the fitted coefficients</span>
<span class="nb">print</span><span class="p">(</span><span class="s2">&quot;模型系数:&quot;</span><span class="p">)</span>
<span class="nb">print</span><span class="p">(</span><span class="n">lasso_lars</span><span class="o">.</span><span class="n">coef_</span><span class="p">)</span>
</pre></div>
</div>
</div>
<div class="cell_output docutils container">
<div class="output stream highlight-myst-ansi notranslate"><div class="highlight"><pre><span></span>模型系数:
[ 0. 0. 0. 0.21248063 -0.00992147 0.
0.00512876 0. 0.00215299 -0.00265399]
</pre></div>
</div>
</div>
</div>
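<p>Because LARS traces the whole piecewise-linear solution path, scikit-learn also exposes it directly through the <code>lars_path</code> function, which returns every breakpoint of the Lasso path in a single pass. The synthetic data below is an illustrative assumption:</p>

```python
import numpy as np
from sklearn.linear_model import lars_path

# Illustrative data (assumed): 5 features, 2 truly nonzero coefficients
rng = np.random.default_rng(0)
X = rng.standard_normal((60, 5))
y = X @ np.array([1.0, 0.0, 0.0, 2.0, 0.0]) + 0.1 * rng.standard_normal(60)

# method="lasso" returns the full Lasso regularization path in one pass:
# the knot values of alpha (decreasing), the order in which features enter
# the active set, and the coefficients at each knot
alphas, active, coefs = lars_path(X, y, method="lasso")
print("features entered in order:", active)
print("path knots (alpha):", alphas.round(3))
```

<p>Inspecting <code>active</code> shows the feature-entry order discussed above, and each column of <code>coefs</code> is the solution at one knot of the path.</p>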
</section>
</section>
</section>

<script type="text/x-thebe-config">
Expand Down Expand Up @@ -815,6 +884,10 @@ <h3>多任务弹性网络<a class="headerlink" href="#id8" title="Permalink to t
<li class="toc-h3 nav-item toc-entry"><a class="reference internal nav-link" href="#id8">多任务弹性网络</a></li>
</ul>
</li>
<li class="toc-h2 nav-item toc-entry"><a class="reference internal nav-link" href="#id9">Least Angle Regression</a><ul class="nav section-nav flex-column">
<li class="toc-h3 nav-item toc-entry"><a class="reference internal nav-link" href="#lars-lasso">LARS Lasso</a></li>
</ul>
</li>
</ul>
</nav></div>

Expand Down
2 changes: 1 addition & 1 deletion searchindex.js

Large diffs are not rendered by default.
