
Commit

update documentation
IHIaadj committed Jun 2, 2023
1 parent fea0918 commit 4018d05
Showing 6 changed files with 154 additions and 2 deletions.
Binary file modified docs/_build/doctrees/environment.pickle
Binary file modified docs/_build/doctrees/getting_started.doctree
47 changes: 47 additions & 0 deletions docs/_build/html/_sources/getting_started.rst.txt
@@ -1,3 +1,50 @@
Tutorial
========

*AnalogAINAS* is a framework for building analog-aware, efficient deep learning models. AnalogAINAS is built on top of `AIHWKIT <https://github.com/IBM/aihwkit>`_, the IBM Analog Hardware Acceleration Kit: an open-source Python toolkit for exploring and using the capabilities of in-memory computing devices in the context of artificial intelligence.

At a high level, AnalogAINAS consists of four main building blocks that interact with each other:

* Configuration spaces: a search space of architectures targeting a specific dataset.
* Evaluator: an ML predictor model that predicts:

  * 1-day accuracy: the evaluator models the drift effect encountered in analog devices. The accuracy after one day of drift is predicted and used as an objective to maximize.
  * The Accuracy Variation for One Month (AVM): the difference between the accuracy after one month and the accuracy after one second.
  * The 1-day accuracy standard deviation: the stochasticity of the noise induces different variations of the model's accuracy depending on its architecture.

* Optimizer: an optimization strategy such as an evolutionary algorithm or Bayesian optimization.
* Worker: a global object that runs the architecture search loop and the final network training pipeline.
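How these four blocks fit together can be sketched with plain-Python stand-ins. The class names and signatures below are illustrative stubs, not the real AnalogAINAS API; they only mirror the sample-evaluate-optimize loop described above:

```python
import random

# Illustrative stubs only -- NOT the AnalogAINAS API.

class ConfigSpaceStub:
    """A toy search space: each architecture is a (depth, width) pair."""
    def sample(self):
        return {"depth": random.choice([8, 14, 20]),
                "width": random.choice([16, 32, 64])}

class EvaluatorStub:
    """Stands in for the ML predictor of 1-day accuracy."""
    def predict(self, arch):
        return 0.5 + 0.001 * arch["depth"] + 0.0005 * arch["width"]

class WorkerStub:
    """Runs the search loop: sample, evaluate, keep the best."""
    def __init__(self, cs, evaluator, nb_iter):
        self.cs, self.evaluator, self.nb_iter = cs, evaluator, nb_iter

    def search(self):
        best, best_score = None, float("-inf")
        for _ in range(self.nb_iter):
            arch = self.cs.sample()
            score = self.evaluator.predict(arch)
            if score > best_score:
                best, best_score = arch, score
        return best, best_score

worker = WorkerStub(ConfigSpaceStub(), EvaluatorStub(), nb_iter=50)
best_arch, best_acc = worker.search()
```

In the real framework, the random-sampling loop is replaced by the Optimizer's strategy (evolutionary search or Bayesian optimization), and the evaluator is a trained surrogate rather than a hand-written formula.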

Create a Configuration Space
----------------------------

AnalogNAS presents a general search space composed of ResNet-like architectures.

The macro-architecture defined in ``search_spaces/resnet_macro_architecture.py`` is customizable to any image classification dataset, given an input shape and the number of output classes.

.. warning::
   Each hyperparameter in the configuration space must have a unique name ID.
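Why the unique-name requirement matters can be shown with a small sketch. The ``ConfigSpaceSketch`` registry below is hypothetical, not part of the AnalogAINAS API; it just illustrates that two hyperparameters with the same name ID would shadow each other:

```python
# Hypothetical sketch -- not the AnalogAINAS API.

class Hyperparameter:
    def __init__(self, name, values):
        self.name, self.values = name, values

class ConfigSpaceSketch:
    def __init__(self):
        self._params = {}

    def add(self, param):
        # Reject duplicate name IDs: a second parameter with the same
        # name would silently overwrite the first during sampling.
        if param.name in self._params:
            raise ValueError(f"duplicate hyperparameter name: {param.name!r}")
        self._params[param.name] = param

cs = ConfigSpaceSketch()
cs.add(Hyperparameter("num_blocks", [1, 2, 3]))
try:
    cs.add(Hyperparameter("num_blocks", [4, 5]))
except ValueError as err:
    print(err)  # the duplicate name is rejected
```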

Evaluator
---------

To speed up the search, we built a machine learning predictor to evaluate the accuracy and robustness of any given architecture from the configuration space.
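The idea behind such a surrogate can be sketched as follows. The features and coefficients below are invented for illustration only; the real evaluator is a trained ML model, and these function names are not part of the AnalogAINAS API:

```python
# Toy surrogate sketch -- coefficients are invented for illustration.

def encode(arch):
    """Encode an architecture dict as a fixed-length feature vector."""
    return [arch["depth"], arch["width"], arch["depth"] * arch["width"]]

def predict_objectives(arch):
    """Return the three predicted objectives described above:
    (1-day accuracy, AVM, 1-day accuracy standard deviation)."""
    x = encode(arch)
    acc_1day = 0.60 + 0.002 * x[0] + 0.0004 * x[1]
    avm      = -0.01 - 0.0001 * x[0]   # accuracy change over one month
    acc_std  = 0.02 - 0.00001 * x[2]   # assumed noise-induced spread
    return acc_1day, avm, acc_std

acc, avm, std = predict_objectives({"depth": 20, "width": 32})
```

Because the predictor is a cheap function of the architecture encoding, thousands of candidates can be ranked without training a single network on hardware.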

Search Optimizer and Worker
---------------------------

In this example, we use evolutionary search to find the best architecture in the configuration space ``CS`` using our evaluator.

::

    from analogainas.search_algorithms.ea_optimized import EAOptimizer
    from analogainas.search_algorithms.worker import Worker

    # evaluator and CS are the Evaluator and configuration space
    # created in the previous sections.
    optimizer = EAOptimizer(evaluator, population_size=20, nb_iter=10)

    NB_RUN = 2
    worker = Worker(CS, optimizer=optimizer, runs=NB_RUN)

    worker.search()

    worker.result_summary()


60 changes: 59 additions & 1 deletion docs/_build/html/getting_started.html
@@ -19,6 +19,7 @@
<script src="_static/js/theme.js"></script>
<link rel="index" title="Index" href="genindex.html" />
<link rel="search" title="Search" href="search.html" />
<link rel="next" title="API Reference" href="api_references.html" />
<link rel="prev" title="Installation" href="install.html" />
</head>

@@ -44,7 +45,17 @@
<p class="caption" role="heading"><span class="caption-text">Get started</span></p>
<ul class="current">
<li class="toctree-l1"><a class="reference internal" href="install.html">Installation</a></li>
<li class="toctree-l1 current"><a class="current reference internal" href="#">Tutorial</a></li>
<li class="toctree-l1 current"><a class="current reference internal" href="#">Tutorial</a><ul>
<li class="toctree-l2"><a class="reference internal" href="#create-a-configuration-space">Create a Configuration Space</a></li>
<li class="toctree-l2"><a class="reference internal" href="#evaluator">Evaluator</a></li>
<li class="toctree-l2"><a class="reference internal" href="#search-optimizer-and-worker">Search Optimizer and Worker</a></li>
</ul>
</li>
</ul>
<p class="caption" role="heading"><span class="caption-text">References</span></p>
<ul>
<li class="toctree-l1"><a class="reference internal" href="api_references.html">API Reference</a></li>
<li class="toctree-l1"><a class="reference internal" href="references.html">Paper References</a></li>
</ul>

</div>
@@ -73,13 +84,60 @@

<section id="tutorial">
<h1>Tutorial<a class="headerlink" href="#tutorial" title="Permalink to this heading"></a></h1>
<p><em>AnalogAINAS</em> is a framework for building analog-aware, efficient deep learning models. AnalogAINAS is built on top of <a class="reference external" href="https://github.com/IBM/aihwkit">AIHWKIT</a>, the IBM Analog Hardware Acceleration Kit: an open-source Python toolkit for exploring and using the capabilities of in-memory computing devices in the context of artificial intelligence.</p>
<p>At a high level, AnalogAINAS consists of four main building blocks that interact with each other:</p>
<ul class="simple">
<li><p>Configuration spaces: a search space of architectures targeting a specific dataset.</p></li>
<li><dl class="simple">
<dt>Evaluator: an ML predictor model that predicts:</dt><dd><ul>
<li><p>1-day accuracy: the evaluator models the drift effect encountered in analog devices. The accuracy after one day of drift is predicted and used as an objective to maximize.</p></li>
<li><p>The Accuracy Variation for One Month (AVM): the difference between the accuracy after one month and the accuracy after one second.</p></li>
<li><p>The 1-day accuracy standard deviation: the stochasticity of the noise induces different variations of the model’s accuracy depending on its architecture.</p></li>
</ul>
</dd>
</dl>
</li>
<li><p>Optimizer: an optimization strategy such as an evolutionary algorithm or Bayesian optimization.</p></li>
<li><p>Worker: a global object that runs the architecture search loop and the final network training pipeline.</p></li>
</ul>
<section id="create-a-configuration-space">
<h2>Create a Configuration Space<a class="headerlink" href="#create-a-configuration-space" title="Permalink to this heading"></a></h2>
<p>AnalogNAS presents a general search space composed of ResNet-like architectures.</p>
<p>The macro-architecture defined in <code class="docutils literal notranslate"><span class="pre">search_spaces/resnet_macro_architecture.py</span></code> is customizable to any image classification dataset, given an input shape and the number of output classes.</p>
<div class="admonition warning">
<p class="admonition-title">Warning</p>
<p>Each hyperparameter in the configuration space must have a unique name ID.</p>
</div>
</section>
<section id="evaluator">
<h2>Evaluator<a class="headerlink" href="#evaluator" title="Permalink to this heading"></a></h2>
<p>To speed up the search, we built a machine learning predictor to evaluate the accuracy and robustness of any given architecture from the configuration space.</p>
</section>
<section id="search-optimizer-and-worker">
<h2>Search Optimizer and Worker<a class="headerlink" href="#search-optimizer-and-worker" title="Permalink to this heading"></a></h2>
<p>In this example, we use evolutionary search to find the best architecture in the configuration space <code class="docutils literal notranslate"><span class="pre">CS</span></code> using our evaluator.</p>
<div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="kn">from</span> <span class="nn">analogainas.search_algorithms.ea_optimized</span> <span class="kn">import</span> <span class="n">EAOptimizer</span>
<span class="kn">from</span> <span class="nn">analogainas.search_algorithms.worker</span> <span class="kn">import</span> <span class="n">Worker</span>

<span class="n">optimizer</span> <span class="o">=</span> <span class="n">EAOptimizer</span><span class="p">(</span><span class="n">evaluator</span><span class="p">,</span> <span class="n">population_size</span><span class="o">=</span><span class="mi">20</span><span class="p">,</span> <span class="n">nb_iter</span><span class="o">=</span><span class="mi">10</span><span class="p">)</span>

<span class="n">NB_RUN</span> <span class="o">=</span> <span class="mi">2</span>
<span class="n">worker</span> <span class="o">=</span> <span class="n">Worker</span><span class="p">(</span><span class="n">CS</span><span class="p">,</span> <span class="n">optimizer</span><span class="o">=</span><span class="n">optimizer</span><span class="p">,</span> <span class="n">runs</span><span class="o">=</span><span class="n">NB_RUN</span><span class="p">)</span>

<span class="n">worker</span><span class="o">.</span><span class="n">search</span><span class="p">()</span>

<span class="n">worker</span><span class="o">.</span><span class="n">result_summary</span><span class="p">()</span>
</pre></div>
</div>
</section>
</section>


</div>
</div>
<footer><div class="rst-footer-buttons" role="navigation" aria-label="Footer">
<a href="install.html" class="btn btn-neutral float-left" title="Installation" accesskey="p" rel="prev"><span class="fa fa-arrow-circle-left" aria-hidden="true"></span> Previous</a>
<a href="api_references.html" class="btn btn-neutral float-right" title="API Reference" accesskey="n" rel="next">Next <span class="fa fa-arrow-circle-right" aria-hidden="true"></span></a>
</div>

<hr/>
