github-actions[bot] committed Jun 9, 2024
1 parent 55f5065 commit 08847da
Showing 46 changed files with 2,192 additions and 440 deletions.
@@ -0,0 +1,24 @@
First Week into GSoC 2024: Building the AutoEncoder, writing the training loop
==============================================================================

.. post:: May 31 2024
:author: Iñigo Tellaetxe
:tags: google
:category: gsoc


What I did this week
~~~~~~~~~~~~~~~~~~~~
I finished becoming familiar with the TensorFlow + Keras basics and I wrote the training loop and a couple of scripts for instantiating and training the AutoEncoder.
Data loading was also addressed and I am able to load the data from the FiberCup dataset in .trk format using `NiBabel <https://nipy.org/nibabel/>`__, transform it into NumPy arrays, and feed it into the network.

What is coming up next week
~~~~~~~~~~~~~~~~~~~~~~~~~~~
Because the training loop is taking too long, I will refactor the code to make it more modular and more in line with the TensorFlow style. Other DIPY models are implemented in this fashion, which will contribute to consistency across the library.
I think I have yet to get the hang of the TensorFlow way of doing things.
I plan to use a class for the ``Encoder`` and another one for the ``Decoder``. Then I will bring them together under an ``AutoEncoder`` class that inherits from the Keras ``Model`` class.
This will allow me to use the ``fit`` method from the Keras ``Model`` class, making the training loop more efficient and easier to use. I will just pass all the relevant training parameters to the ``compile`` method, then call the ``fit`` method to train the model, which takes care of the weight updates more efficiently than my handmade training loop.
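A minimal sketch of this class layout, assuming streamlines resampled to a fixed number of points; the layer sizes and names here are illustrative placeholders, not the eventual DIPY design.

```python
import tensorflow as tf
from tensorflow import keras


class Encoder(keras.layers.Layer):
    """Maps a (n_points, 3) streamline to a latent vector."""

    def __init__(self, latent_dim=32, **kwargs):
        super().__init__(**kwargs)
        self.flatten = keras.layers.Flatten()
        self.dense = keras.layers.Dense(latent_dim, activation="relu")

    def call(self, x):
        return self.dense(self.flatten(x))


class Decoder(keras.layers.Layer):
    """Maps a latent vector back to a (n_points, 3) streamline."""

    def __init__(self, n_points=256, **kwargs):
        super().__init__(**kwargs)
        self.dense = keras.layers.Dense(n_points * 3)
        self.reshape = keras.layers.Reshape((n_points, 3))

    def call(self, z):
        return self.reshape(self.dense(z))


class AutoEncoder(keras.Model):
    """Encoder + Decoder; inheriting from keras.Model gives compile()/fit()."""

    def __init__(self, latent_dim=32, n_points=256, **kwargs):
        super().__init__(**kwargs)
        self.encoder = Encoder(latent_dim)
        self.decoder = Decoder(n_points)

    def call(self, x):
        return self.decoder(self.encoder(x))


model = AutoEncoder()
model.compile(optimizer="adam", loss="mse")
# model.fit(tracts, tracts, epochs=150) would then replace the handmade loop.
```

With this structure, ``compile`` receives the optimizer and loss once, and ``fit`` handles batching and weight updates internally.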

Did I get stuck anywhere
~~~~~~~~~~~~~~~~~~~~~~~~
Getting the handmade training loop up and running gave me a couple of headaches because the weight update was taking ages. I am still stuck here and that is why I will refactor the code next week.
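For context, a handmade training loop of the kind described here usually updates the weights explicitly with ``tf.GradientTape``. The model, data, and epoch count below are toy stand-ins, not the actual AutoEncoder setup.

```python
import tensorflow as tf

# Hypothetical stand-ins for the real AutoEncoder and the FiberCup data.
model = tf.keras.Sequential(
    [tf.keras.layers.Dense(8, activation="relu"), tf.keras.layers.Dense(3)]
)
optimizer = tf.keras.optimizers.Adam()
loss_fn = tf.keras.losses.MeanSquaredError()
data = tf.random.normal((32, 3))  # a single toy batch

for epoch in range(5):
    with tf.GradientTape() as tape:
        reconstruction = model(data, training=True)
        loss = loss_fn(data, reconstruction)
    # Manual weight update: this is the step keras.Model.fit() automates
    # (and typically runs faster, via a compiled tf.function train step).
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
```

Running such a loop eagerly, without wrapping the train step in ``tf.function``, is one common reason it can feel much slower than ``fit``.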
@@ -1,5 +1,5 @@
First Week into GSoC 2024: Building the AutoEncoder, preliminary results
========================================================================
Second Week into GSoC 2024: Refactoring the AutoEncoder, preliminary results
============================================================================

.. post:: June 7 2024
:author: Iñigo Tellaetxe
@@ -9,7 +9,7 @@ First Week into GSoC 2024: Building the AutoEncoder, preliminary results

What I did this week
~~~~~~~~~~~~~~~~~~~~
This week I refactored the AutoEncoder code to match the design patterns and the organization of other Deep Learning models in the DIPY repo. I transferred my code to a `separate repo <https://github.com/itellaetxe/tractoencoder_gsoc>`_ to keep the DIPY repo clean and to experiment freely. Once the final product is working, I will merge it into DIPY. I also packaged the whole repo so I can use it as a library.
This week I refactored the AutoEncoder code to match the design patterns and the organization of other Deep Learning models in the DIPY repo, and to make the training loop more efficient and easier to use. I transferred my code to a `separate repo <https://github.com/itellaetxe/tractoencoder_gsoc>`_ to keep the DIPY repo clean and to experiment freely. Once the final product is working, I will merge it into DIPY. I also packaged the whole repo so I can use it as a library.
Training experiments were run for a maximum of 150 epochs, with variable results. They are not amazing, but at least we get some reconstruction of the input tracts from FiberCup, which seems to be on the right track. I also implemented training logs that report the parameters I used for training, so I can reproduce the results at any time. This still needs work though, because not all parameters are stored. Need to polish!
The left image shows the input tracts, and the middle and right images show two reconstructions from two different training experiments.

@@ -25,5 +25,5 @@ This week should focus on trying to reproduce the small implementation differences

Did I get stuck anywhere
~~~~~~~~~~~~~~~~~~~~~~~~
I got a bit stuck refactoring the code to match the DIPY design patterns and also with the TensorFlow implementation itself, because the output shape of the Encoder and the input shape of the Decoder were not matching.
After investigating what caused this issue, I discovered that `tf.shape` was not giving me the usual (and expected) shape of the tensor, conveniently stored in a `tuple`. I found this behavior strange, but I solved the problem just calling a `.shape` method on the Tensor, which does give me the shape tuple I needed.
I got a bit stuck refactoring the code to match the DIPY design patterns and also with the TensorFlow implementation itself, because the output shape of the ``Encoder`` and the input shape of the ``Decoder`` were not matching.
After investigating what caused this issue, I discovered that ``tf.shape`` was not giving me the usual (and expected) shape of the tensor, conveniently stored in a ``tuple``. I found this behavior strange, but I solved the problem by accessing the ``.shape`` attribute of the Tensor, which does give me the shape tuple I needed.
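The gotcha above is easy to reproduce: ``tf.shape()`` returns the dynamic shape as a 1-D ``Tensor`` (useful inside graph code), while the ``.shape`` attribute gives the static ``TensorShape``, which behaves like a tuple of ints.

```python
import tensorflow as tf

t = tf.zeros((4, 8))
dynamic = tf.shape(t)  # a 1-D int32 Tensor holding [4, 8], not a tuple
static = t.shape       # a TensorShape; converts cleanly to (4, 8)

print(tuple(static))   # (4, 8)
```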
80 changes: 77 additions & 3 deletions dipy.org/pull/39/blog.html
@@ -811,7 +811,7 @@ <h1>

<div class="section ablog-post">
<h2 class="ablog-post-title">
<a href="posts/2024/2024_06_07_Inigo_week_1.html">First Week into GSoC 2024: Building the AutoEncoder, preliminary results</a>
<a href="posts/2024/2024_06_07_Inigo_week_2.html">Second Week into GSoC 2024: Refactoring the AutoEncoder, preliminary results</a>
</h2>
<ul class="ablog-archive">
<li>
@@ -875,12 +875,12 @@ <h2 class="ablog-post-title">

</div>
</ul>
<p class="ablog-post-excerpt"><p>This week I refactored the AutoEncoder code to match the design patterns and the organization of other Deep Learning models in the DIPY repo. I transferred my code to a <a class="reference external" href="https://github.com/itellaetxe/tractoencoder_gsoc">separate repo</a> to keep the DIPY repo clean and to experiment freely. Once the final product is working, I will merge it into DIPY. I also packaged the whole repo so I can use it as a library.
<p class="ablog-post-excerpt"><p>This week I refactored the AutoEncoder code to match the design patterns and the organization of other Deep Learning models in the DIPY repo; and to make the training loop more efficient and easy to use. I transferred my code to a <a class="reference external" href="https://github.com/itellaetxe/tractoencoder_gsoc">separate repo</a> to keep the DIPY repo clean and to experiment freely. Once the final product is working, I will merge it into DIPY. I also packaged the whole repo so I can use it as a library.
Training experiments were run for a maximum of a 150 epochs, with variable results. They are not amazing, but at least we get some reconstruction of the input tracts from FiberCup, which seems to be on the right track. I also implemented training logs that report the parameters I used for training, so I can reproduce the results at any time. This still needs work though, because not all parameters are stored. Need to polish!
The left image shows the input tracts, and the middle and right images show two reconstructions from two different training experiments.</p>
</p>

<p class="ablog-post-expand"><a href="posts/2024/2024_06_07_Inigo_week_1.html"><em>Read more ...</em></a></p>
<p class="ablog-post-expand"><a href="posts/2024/2024_06_07_Inigo_week_2.html"><em>Read more ...</em></a></p>
<hr/>
</div>

@@ -958,6 +958,80 @@ <h2 class="ablog-post-title">
<hr/>
</div>

<div class="section ablog-post">
<h2 class="ablog-post-title">
<a href="posts/2024/2024_05_31_Inigo_week_1.html">First Week into GSoC 2024: Building the AutoEncoder, writing the training loop</a>
</h2>
<ul class="ablog-archive">
<li>


<i class="fa fa-calendar"></i>

31 May 2024

</li>
<div class="ablog-sidebar-item ablog__postcard2">


<li id="ablog-sidebar-item author ablog__author">
<span>

<i class="fa-fw fa fa-user"></i>

</span>


<a href="blog/author/inigo-tellaetxe.html">Iñigo Tellaetxe</a>



</li>




<li id="ablog-sidebar-item category ablog__category">
<span>

<i class="fa-fw fa fa-folder-open"></i>

</span>


<a href="blog/category/gsoc.html">gsoc</a>



</li>


<li id="ablog-sidebar-item tags ablog__tags">
<span>


<i class="fa-fw fa fa-tag"></i>

</span>


<a href="blog/tag/google.html">google</a>



</li>


</div>
</ul>
<p class="ablog-post-excerpt"><p>I finished becoming familiar with the TensorFlow + Keras basics and I wrote the training loop and a couple of scripts for instantiating and training the AutoEncoder.
Data loading was also addressed and I am able to load the data from the FiberCup dataset in .trk format using <a class="reference external" href="https://nipy.org/nibabel/">NiBabel</a>, transform it into NumPy arrays, and feed it into the network.</p>
</p>

<p class="ablog-post-expand"><a href="posts/2024/2024_05_31_Inigo_week_1.html"><em>Read more ...</em></a></p>
<hr/>
</div>

<div class="section ablog-post">
<h2 class="ablog-post-title">
<a href="posts/2024/2024_05_27_kaustav_week0.html">My Journey Begins: Community Bonding Period with DIPY</a>
20 changes: 10 additions & 10 deletions dipy.org/pull/39/blog/2023.html
@@ -759,8 +759,8 @@ <h3>


<li>
<a href="../posts/2024/2024_06_07_Inigo_week_1.html">
07 June - First Week into GSoC 2024: Building the AutoEncoder, preliminary results
<a href="../posts/2024/2024_06_07_Inigo_week_2.html">
07 June - Second Week into GSoC 2024: Refactoring the AutoEncoder, preliminary results
</a>
</li>

@@ -771,20 +771,20 @@ <h3>
</li>

<li>
<a href="../posts/2024/2024_05_27_kaustav_week0.html">
27 May - My Journey Begins: Community Bonding Period with DIPY
<a href="../posts/2024/2024_05_31_Inigo_week_1.html">
31 May - First Week into GSoC 2024: Building the AutoEncoder, writing the training loop
</a>
</li>

<li>
<a href="../posts/2024/2024_05_27_Inigo_week_0.html">
27 May - Community Bonding Period Summary and first impressions
<a href="../posts/2024/2024_05_27_kaustav_week0.html">
27 May - My Journey Begins: Community Bonding Period with DIPY
</a>
</li>

<li>
<a href="../posts/2023/2023_08_28_Shilpi_Week14.html">
28 August - Doing Final Touch-Ups: Week 14
<a href="../posts/2024/2024_05_27_Inigo_week_0.html">
27 May - Community Bonding Period Summary and first impressions
</a>
</li>

@@ -814,7 +814,7 @@ <h3>


<li>
<a href="category/gsoc.html">gsoc (27)</a>
<a href="category/gsoc.html">gsoc (28)</a>
</li>


@@ -830,7 +830,7 @@ <h3>


<li>
<a href="2024.html">2024 (4)</a>
<a href="2024.html">2024 (5)</a>
</li>


100 changes: 87 additions & 13 deletions dipy.org/pull/39/blog/2024.html
@@ -759,8 +759,8 @@ <h3>


<li>
<a href="../posts/2024/2024_06_07_Inigo_week_1.html">
07 June - First Week into GSoC 2024: Building the AutoEncoder, preliminary results
<a href="../posts/2024/2024_06_07_Inigo_week_2.html">
07 June - Second Week into GSoC 2024: Refactoring the AutoEncoder, preliminary results
</a>
</li>

@@ -771,20 +771,20 @@ <h3>
</li>

<li>
<a href="../posts/2024/2024_05_27_kaustav_week0.html">
27 May - My Journey Begins: Community Bonding Period with DIPY
<a href="../posts/2024/2024_05_31_Inigo_week_1.html">
31 May - First Week into GSoC 2024: Building the AutoEncoder, writing the training loop
</a>
</li>

<li>
<a href="../posts/2024/2024_05_27_Inigo_week_0.html">
27 May - Community Bonding Period Summary and first impressions
<a href="../posts/2024/2024_05_27_kaustav_week0.html">
27 May - My Journey Begins: Community Bonding Period with DIPY
</a>
</li>

<li>
<a href="../posts/2023/2023_08_28_Shilpi_Week14.html">
28 August - Doing Final Touch-Ups: Week 14
<a href="../posts/2024/2024_05_27_Inigo_week_0.html">
27 May - Community Bonding Period Summary and first impressions
</a>
</li>

@@ -814,7 +814,7 @@ <h3>


<li>
<a href="category/gsoc.html">gsoc (27)</a>
<a href="category/gsoc.html">gsoc (28)</a>
</li>


@@ -830,7 +830,7 @@ <h3>


<li>
<a href="#">2024 (4)</a>
<a href="#">2024 (5)</a>
</li>


@@ -907,7 +907,7 @@ <h1>

<div class="section ablog-post">
<h2 class="ablog-post-title">
<a href="../posts/2024/2024_06_07_Inigo_week_1.html">First Week into GSoC 2024: Building the AutoEncoder, preliminary results</a>
<a href="../posts/2024/2024_06_07_Inigo_week_2.html">Second Week into GSoC 2024: Refactoring the AutoEncoder, preliminary results</a>
</h2>
<ul class="ablog-archive">
<li>
@@ -971,12 +971,12 @@ <h2 class="ablog-post-title">

</div>
</ul>
<p class="ablog-post-excerpt"><p>This week I refactored the AutoEncoder code to match the design patterns and the organization of other Deep Learning models in the DIPY repo. I transferred my code to a <a class="reference external" href="https://github.com/itellaetxe/tractoencoder_gsoc">separate repo</a> to keep the DIPY repo clean and to experiment freely. Once the final product is working, I will merge it into DIPY. I also packaged the whole repo so I can use it as a library.
<p class="ablog-post-excerpt"><p>This week I refactored the AutoEncoder code to match the design patterns and the organization of other Deep Learning models in the DIPY repo; and to make the training loop more efficient and easy to use. I transferred my code to a <a class="reference external" href="https://github.com/itellaetxe/tractoencoder_gsoc">separate repo</a> to keep the DIPY repo clean and to experiment freely. Once the final product is working, I will merge it into DIPY. I also packaged the whole repo so I can use it as a library.
Training experiments were run for a maximum of a 150 epochs, with variable results. They are not amazing, but at least we get some reconstruction of the input tracts from FiberCup, which seems to be on the right track. I also implemented training logs that report the parameters I used for training, so I can reproduce the results at any time. This still needs work though, because not all parameters are stored. Need to polish!
The left image shows the input tracts, and the middle and right images show two reconstructions from two different training experiments.</p>
</p>

<p class="ablog-post-expand"><a href="../posts/2024/2024_06_07_Inigo_week_1.html"><em>Read more ...</em></a></p>
<p class="ablog-post-expand"><a href="../posts/2024/2024_06_07_Inigo_week_2.html"><em>Read more ...</em></a></p>
<hr/>
</div>

@@ -1054,6 +1054,80 @@ <h2 class="ablog-post-title">
<hr/>
</div>

<div class="section ablog-post">
<h2 class="ablog-post-title">
<a href="../posts/2024/2024_05_31_Inigo_week_1.html">First Week into GSoC 2024: Building the AutoEncoder, writing the training loop</a>
</h2>
<ul class="ablog-archive">
<li>


<i class="fa fa-calendar"></i>

31 May 2024

</li>
<div class="ablog-sidebar-item ablog__postcard2">


<li id="ablog-sidebar-item author ablog__author">
<span>

<i class="fa-fw fa fa-user"></i>

</span>


<a href="author/inigo-tellaetxe.html">Iñigo Tellaetxe</a>



</li>




<li id="ablog-sidebar-item category ablog__category">
<span>

<i class="fa-fw fa fa-folder-open"></i>

</span>


<a href="category/gsoc.html">gsoc</a>



</li>


<li id="ablog-sidebar-item tags ablog__tags">
<span>


<i class="fa-fw fa fa-tag"></i>

</span>


<a href="tag/google.html">google</a>



</li>


</div>
</ul>
<p class="ablog-post-excerpt"><p>I finished becoming familiar with the TensorFlow + Keras basics and I wrote the training loop and a couple of scripts for instantiating and training the AutoEncoder.
Data loading was also addressed and I am able to load the data from the FiberCup dataset in .trk format using <a class="reference external" href="https://nipy.org/nibabel/">NiBabel</a>, transform it into NumPy arrays, and feed it into the network.</p>
</p>

<p class="ablog-post-expand"><a href="../posts/2024/2024_05_31_Inigo_week_1.html"><em>Read more ...</em></a></p>
<hr/>
</div>

<div class="section ablog-post">
<h2 class="ablog-post-title">
<a href="../posts/2024/2024_05_27_kaustav_week0.html">My Journey Begins: Community Bonding Period with DIPY</a>
