diff --git a/docs/_build/doctrees/index.doctree b/docs/_build/doctrees/index.doctree
index 0a9404a..35611fe 100644
Binary files a/docs/_build/doctrees/index.doctree and b/docs/_build/doctrees/index.doctree differ
diff --git a/docs/_build/html/.buildinfo b/docs/_build/html/.buildinfo
index 0e4950c..ab3c793 100644
--- a/docs/_build/html/.buildinfo
+++ b/docs/_build/html/.buildinfo
@@ -1,4 +1,4 @@
 # Sphinx build info version 1
 # This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
-config: 0296ca83c7de98e5a331c0e059428bb3
+config: bdf77414e9e9ce716ef91ad58ab30f92
 tags: 645f666f9bcd5a90fca523b33c5a78b7
diff --git a/docs/_build/html/_modules/index.html b/docs/_build/html/_modules/index.html
index 8f30d44..ce2bfd5 100644
--- a/docs/_build/html/_modules/index.html
+++ b/docs/_build/html/_modules/index.html
@@ -5,7 +5,7 @@
-    Overview: module code — pyCNN-LSTM documentation
+    Overview: module code — TSFEDL documentation
+
@@ -49,161 +50,35 @@

Keras Blocks
-pyCNN_LSTM.blocks.densenet_transition_block(x, …)
-A transition block of densenet for 1D data.
+TSFEDL.blocks.densenet_transition_block
+
-pyCNN_LSTM.blocks.densenet_conv_block(x, …)
-A building block for a dense block from densenet for 1D data.
+TSFEDL.blocks.densenet_conv_block
+
-pyCNN_LSTM.blocks.densenet_dense_block(x, …)
-A dense block of densenet for 1D data.
+TSFEDL.blocks.densenet_dense_block
+
-pyCNN_LSTM.blocks.squeeze_excitation_module(x, …)
-Squeeze-and-Excitation Module.
+TSFEDL.blocks.squeeze_excitation_module
+
-pyCNN_LSTM.blocks.conv_block_YiboGao(in_x, …)
-Convolutional block of YiboGao’s model
+TSFEDL.blocks.conv_block_YiboGao
+
-pyCNN_LSTM.blocks.attention_branch_YiboGao(…)
-Attention branch of YiboGao’s model
+TSFEDL.blocks.attention_branch_YiboGao
+
-pyCNN_LSTM.blocks.RTA_block(in_x, nb_filter, …)
-Residual-based Temporal Attention (RTA) block.
+TSFEDL.blocks.RTA_block
+
-pyCNN_LSTM.blocks.spatial_attention_block_ZhangJin(…)
-Spatial attention module of ZhangJin’s model
+TSFEDL.blocks.spatial_attention_block_ZhangJin
+
-pyCNN_LSTM.blocks.temporal_attention_block_ZhangJin(x)
-Temporal attention module of ZhangJin’s Model.
+TSFEDL.blocks.temporal_attention_block_ZhangJin
+
-
-pyCNN_LSTM.blocks.densenet_transition_block(x, reduction, name)[source]
-

A transition block of densenet for 1D data.

Parameters:
    -
  • x – input tensor.
  • -
  • reduction – float, compression rate at transition layers.
  • -
  • name – string, block label.
  • -
-
Returns:

output tensor for the block.

-
-
- -
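As a rough illustration of what a 1D DenseNet transition block does (channel compression by `reduction` via a 1x1 convolution, then average pooling that halves the temporal length), here is a pure-numpy stand-in. This is a sketch of the technique, not the TSFEDL/Keras implementation; the function name and weight initialization are hypothetical.

```python
import numpy as np

def transition_block_1d_sketch(x, reduction):
    """Illustrative sketch (not the TSFEDL code): a DenseNet transition
    block compresses the channel count by `reduction` with a 1x1
    convolution and halves the temporal length with average pooling."""
    length, channels = x.shape
    out_channels = int(channels * reduction)
    # A 1x1 convolution is a per-timestep linear projection of the channels.
    w = np.random.randn(channels, out_channels) * 0.01
    x = x @ w
    # Average pooling with stride 2 halves the temporal dimension.
    x = (x[0::2] + x[1::2]) / 2.0
    return x

x = np.random.randn(100, 64)                 # (timesteps, channels)
y = transition_block_1d_sketch(x, reduction=0.5)
print(y.shape)                               # (50, 32)
```

With `reduction=0.5`, 64 channels compress to 32 and 100 timesteps pool down to 50, which is the shape change the block's parameters describe.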
-
-pyCNN_LSTM.blocks.densenet_conv_block(x, growth_rate, name)[source]
-

A building block for a dense block from densenet for 1D data.

Parameters:
    -
  • x – input tensor.
  • -
  • growth_rate – float, growth rate at dense layers.
  • -
  • name – string, block label.
  • -
-
Returns:

Output tensor for the block.

-
-
- -
-
-pyCNN_LSTM.blocks.densenet_dense_block(x, blocks, growth_rate, name)[source]
-

A dense block of densenet for 1D data.

Parameters:
    -
  • x – input tensor.
  • -
  • blocks – integer, the number of building blocks.
  • -
  • growth_rate – float, growth rate at dense layers.
  • -
  • name – string, block label.
  • -
-
Returns:

Output tensor for the block.

-
-
- -
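The defining property of a dense block is that every building block concatenates `growth_rate` new feature channels onto its input, so channels grow linearly with the number of blocks. A minimal numpy sketch of that channel arithmetic (hypothetical code, not the TSFEDL implementation):

```python
import numpy as np

def dense_block_1d_sketch(x, blocks, growth_rate):
    """Illustrative sketch (not the TSFEDL code): each building block
    produces `growth_rate` new channels and concatenates them onto its
    input, so the output has in_channels + blocks * growth_rate channels."""
    for _ in range(blocks):
        in_channels = x.shape[1]
        w = np.random.randn(in_channels, growth_rate) * 0.01
        new_features = np.maximum(x @ w, 0.0)     # stand-in for conv + ReLU
        x = np.concatenate([x, new_features], axis=1)
    return x

x = np.random.randn(100, 64)
y = dense_block_1d_sketch(x, blocks=6, growth_rate=32)
print(y.shape)   # (100, 256), i.e. 64 + 6 * 32 channels
```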
-
-pyCNN_LSTM.blocks.squeeze_excitation_module(x, dense_units)[source]
-

Squeeze-and-Excitation Module.

-

References

-

Squeeze-and-Excitation Networks, Jie Hu, Li Shen, Samuel Albanie, Gang Sun, Enhua Wu (arXiv:1709.01507v4)

Parameters:
    -
  • x (keras.Tensor) – The input tensor.
  • -
  • dense_units (integer) – The number of units in each dense layer.
  • -
-
Returns:

se – Output tensor for the block.

-
Return type:

Keras.Tensor

-
-
- -
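The Squeeze-and-Excitation idea from the cited paper (Hu et al., arXiv:1709.01507) can be sketched in a few lines: global average pooling "squeezes" the input to one value per channel, two dense layers (ReLU, then sigmoid) produce a gate in (0, 1), and the gate rescales the input channels. The numpy code below is an illustrative stand-in for the Keras module, not the TSFEDL implementation.

```python
import numpy as np

def squeeze_excitation_sketch(x, dense_units):
    """Illustrative numpy sketch of Squeeze-and-Excitation, not the
    TSFEDL code: squeeze by global average pooling, excite with two
    dense layers, then recalibrate the input channel-wise."""
    length, channels = x.shape
    squeeze = x.mean(axis=0)                          # (channels,)
    w1 = np.random.randn(channels, dense_units) * 0.01
    w2 = np.random.randn(dense_units, channels) * 0.01
    hidden = np.maximum(squeeze @ w1, 0.0)            # ReLU
    scale = 1.0 / (1.0 + np.exp(-(hidden @ w2)))      # sigmoid gate in (0, 1)
    return x * scale                                  # same shape, reweighted

x = np.random.randn(100, 64)
se = squeeze_excitation_sketch(x, dense_units=16)
print(se.shape)   # (100, 64) — shape preserved, channels rescaled
```

Note the output keeps the input shape; the module only reweights channels, which is why it can be dropped into an existing convolutional stack.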
-
-pyCNN_LSTM.blocks.conv_block_YiboGao(in_x, nb_filter, kernel_size)[source]
-

Convolutional block of YiboGao’s model

-
- -
-
-pyCNN_LSTM.blocks.attention_branch_YiboGao(in_x, nb_filter, kernel_size)[source]
-

Attention branch of YiboGao’s model

-
- -
-
-pyCNN_LSTM.blocks.RTA_block(in_x, nb_filter, kernel_size)[source]
-

Residual-based Temporal Attention (RTA) block.

-

References

-

Gao, Y., Wang, H., & Liu, Z. (2021). An end-to-end atrial fibrillation detection by a novel residual-based temporal attention convolutional neural network with exponential nonlinearity loss. Knowledge-Based Systems, 212, 106589.

-
- -
-
-pyCNN_LSTM.blocks.spatial_attention_block_ZhangJin(decrease_ratio, x)[source]
-

Spatial attention module of ZhangJin’s model

-
- -
-
-pyCNN_LSTM.blocks.temporal_attention_block_ZhangJin(x)[source]
-

Temporal attention module of ZhangJin’s Model.

-
- @@ -216,6 +91,7 @@

Related Topics

diff --git a/docs/_build/html/blocks_pytorch.html b/docs/_build/html/blocks_pytorch.html
index 69ad88c..7fc3e86 100644
--- a/docs/_build/html/blocks_pytorch.html
+++ b/docs/_build/html/blocks_pytorch.html
@@ -5,7 +5,7 @@
-    PyTorch Blocks — pyCNN-LSTM documentation
+    PyTorch Blocks — TSFEDL documentation
+
@@ -49,20 +50,20 @@

Data Functions
-pyCNN_LSTM.data.get_mit_bih_segments(data, …)
+TSFEDL.data.get_mit_bih_segments(data, …)
 It generates the segments of uninterrupted sequences of arrhythmia beats into the corresponding arrhythmia groups in labels.
-pyCNN_LSTM.data.read_mit_bih(path, labels, , …)
+TSFEDL.data.read_mit_bih(path, labels, , , , ])
 It reads the MIT-BIH Arrhythmia X with the specified default configuration of the work presented at: Oh, Shu Lih, et al.
-pyCNN_LSTM.data.MIT_BIH(path[, labels, …])
+TSFEDL.data.MIT_BIH(path[, labels, dtype, …])
 Reads the MIT-BIH datasets and returns a data loader with shape (N, C, L) where N is the batch size, C is the number of channels (1 in this dataset) and L is the length of the time series (1000 by default).
-
-pyCNN_LSTM.data.get_mit_bih_segments(data: wfdb.io.record.Record, annotations: wfdb.io.annotation.Annotation, labels: numpy.ndarray, left_offset: int = 99, right_offset: int = 160, fixed_length: typing.Union[int, NoneType] = None) → typing.Tuple[numpy.ndarray, numpy.ndarray][source]
+
+TSFEDL.data.get_mit_bih_segments(data: wfdb.io.record.Record, annotations: wfdb.io.annotation.Annotation, labels: numpy.ndarray, left_offset: int = 99, right_offset: int = 160, fixed_length: typing.Union[int, NoneType] = None) → typing.Tuple[numpy.ndarray, numpy.ndarray][source]

It generates the segments of uninterrupted sequences of arrhythmia beats into the corresponding arrhythmia groups in labels.
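The `left_offset` / `right_offset` parameters define a window cut around each annotated beat sample: 99 samples before and 160 after by default. A simplified sketch of that segmentation logic, with plain arrays standing in for the `wfdb` record and annotation objects (hypothetical helper, not the TSFEDL code):

```python
import numpy as np

def extract_beat_segments(signal, beat_samples, beat_labels, wanted_labels,
                          left_offset=99, right_offset=160):
    """Simplified sketch of beat segmentation (not the TSFEDL code): for
    every annotated beat whose label is in `wanted_labels`, cut a window
    of `left_offset` samples before and `right_offset` samples after the
    annotation, skipping beats too close to the record edges."""
    segments, labels = [], []
    for sample, label in zip(beat_samples, beat_labels):
        if label not in wanted_labels:
            continue
        start, end = sample - left_offset, sample + right_offset
        if start < 0 or end > len(signal):
            continue                      # window would run off the record
        segments.append(signal[start:end])
        labels.append(label)
    return np.array(segments), np.array(labels)

signal = np.random.randn(5000)
segs, labs = extract_beat_segments(signal, [120, 900, 4990], ['N', 'V', 'N'],
                                   wanted_labels={'N', 'V'})
print(segs.shape)   # (2, 259): the beat at 4990 is dropped (window off the end)
```

Each kept segment has length `left_offset + right_offset` = 259 samples unless a `fixed_length` is imposed afterwards.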

@@ -92,8 +93,8 @@

Data Functions -
-pyCNN_LSTM.data.read_mit_bih(path: str, labels: numpy.ndarray = array(['N', 'L', 'R', 'A', 'V'], dtype='<U1'), left_offset: int = 99, right_offset: int = 160, fixed_length: typing.Union[int, NoneType] = 1000) → typing.Tuple[numpy.ndarray, numpy.ndarray][source]
+
+TSFEDL.data.read_mit_bih(path: str, labels: numpy.ndarray = array(['N', 'L', 'R', 'A', 'V'], dtype='<U1'), left_offset: int = 99, right_offset: int = 160, fixed_length: typing.Union[int, NoneType] = 1000) → typing.Tuple[numpy.ndarray, numpy.ndarray][source]

It reads the MIT-BIH Arrhythmia X with the specified default configuration of the work presented at: Oh, Shu Lih, et al. “Automated diagnosis of arrhythmia using combination of CNN and LSTM techniques with variable length heart beats.” Computers in biology and medicine 102 (2018): 278-287.
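The `fixed_length=1000` default implies every heartbeat segment is brought to a common length before batching. A hypothetical pad-or-truncate helper illustrating that step (an assumption about the behavior, not the TSFEDL code):

```python
import numpy as np

def to_fixed_length(segment, fixed_length=1000):
    """Hypothetical helper (not the TSFEDL code): zero-pad or truncate a
    variable-length heartbeat segment to `fixed_length` samples, as the
    fixed_length=1000 default of read_mit_bih suggests."""
    if len(segment) >= fixed_length:
        return segment[:fixed_length]
    return np.pad(segment, (0, fixed_length - len(segment)))

short = to_fixed_length(np.ones(259))    # padded up with zeros
long_ = to_fixed_length(np.ones(1200))   # truncated down
print(short.shape, long_.shape)          # (1000,) (1000,)
```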

@@ -123,8 +124,8 @@

Data Functions -
-pyCNN_LSTM.data.MIT_BIH(*args, **kwds)[source]
+
+TSFEDL.data.MIT_BIH(*args, **kwds)[source]

Reads the MIT-BIH datasets and returns a data loader with shape (N, C, L) where N is the batch size, C is the number of channels (1 in this dataset) and L is the length of the time series (1000 by default).
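The (N, C, L) convention means batch, channel, then length, so single-channel segments of 1000 samples need a channel axis inserted. A minimal numpy illustration of that layout (stand-in data, not the TSFEDL loader):

```python
import numpy as np

# Hypothetical stand-in for what a loader would yield: 32 heartbeat
# segments, each a single-channel series of 1000 samples.
segments = np.random.randn(32, 1000)     # (N, L) raw segments
batch = segments[:, np.newaxis, :]       # insert the channel axis, C = 1
print(batch.shape)                       # (32, 1, 1000) == (N, C, L)
```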

@@ -141,6 +142,7 @@

Related Topics

diff --git a/docs/_build/html/genindex.html b/docs/_build/html/genindex.html
index 8d0aeee..e8d16b4 100644
--- a/docs/_build/html/genindex.html
+++ b/docs/_build/html/genindex.html
@@ -6,7 +6,7 @@
-    Index — pyCNN-LSTM documentation
+    Index — TSFEDL documentation