Release 1.0.1 (#197)
* Remove extra whitespace from line ends

* Allow vertical surfaces to be found between any or multiple layers, rather than only the next consecutive layer in the list

* Migrate outputs to persistent storage & database them (#153)

* #141 initial

* #141 configs as string

* #141 source posting lambda

* #141 (wip) start using dynamic params artifact

* #141 (wip) parameters artifact

* #141 (wip) fix params artifact lambda

* #141 (wip) use sh-style sourcing

* #141 (wip) trust just-written params file

* #141 (wip) remove old parameter setting

* #141 (wip) posix facts

* #141 (wip) missing file

* #141 (wip) typo

* #141 (wip) checking

* #141 (wip) get params into build env before using them :P

* #141 (wip) debugging unrecognized params

* #141 (wip) config syntax

* #141 (wip) capability syntax, reference typo

* #141 (wip) build lambda for migration

* #141 (wip) update cfn pipeline to add deploy package info

* #141 (wip) hopefully get outputs in place

* #141 (wip) persistent ci outputs also conditional

* #141 (wip) missing comma

* #141 (wip) output migration

* #141 (wip) dynamo env vars to lambda

* #141 (wip) don't use nested attributes

* #141 add a script for downloading latest reconstructions

* #141 rely on lifecycle rule to clear out landing buckets (reruns are easier, since you can download -> upload)

* GH-141 record raw swc key (not using in intermediate stages right now to avoid breakage before release)

* #151 add backoff to landing stage

* #151 add backoff to all ecs tasks

* GH-151 if no slice transform provided, default it to None

* #151 retry in an additional error case (ECS.InvalidParameterException resulting from ec2 rate limits)

* Feature adaptive scale (#121)

* Remove extra whitespace at line ends

* Improve performance of intersection search by adaptive scale

* trigger ci

Co-authored-by: Matt Aitken <matthewaitken2@gmail.com>

* Remove trailing whitespace

* Fix number of stems with node_types

* Fix stem exit features

* Add tests for stem features

* Add additional stem for test

* Fix bugs with number of branches and max branch order

* Add tests for num_branches and max_branch_order

* Remove trailing whitespace

* Fix contraction bug with node_types specified

* Fix max path length bugs

* Add tests for contraction and path length

* Modify early_branch_path to use new _calculate_max_path_distance

* GH-160 merges stem exit test files

Moves new stem exit and distance test from @gouwens into the existing test file. Uses MorphologyBuilder to construct the morphology. Updates the existing test to account for new rootwise output list.

* GH-160 migrate new number of stems test

* GH-161 colocate intrinsic feature tests

* GH-161 max branch order cleanup

* GH-162 GH-163 colocate path features tests

* modifies snap polygons to work on the entire June 2020 release set

checkpoint

bit of refactoring

bit of refactoring

documentation, tweaks for cortical surface trimming

provide mp resolver to get snapped polys test

* adds tests and documentation for snap polygons update

tests for remove_duplicates

tests for find transition

tests for first_met

remaining cortex surface tests

additional geometries testing

more docs, minor tweaks, schema params

linting, typing, docs

snap polygons integration tests

* updates argschema to 3.0a candidate branch (#166)

* updates argschema to 3.0a candidate branch

* changes argschema requirement to pypi package

* uses specific pypi argschema prerelease

* removes test_mac release requirement and WIP from readme (#172)

* update version to 1.0.0

* GH-174 switch to autoapi

* adds optional reconstruction_ids argument and downloads raw and marke… (#173)

* adds optional reconstruction_ids argument and downloads raw and marker files as well

* uses specimen_ids instead of reconstruction_ids in filenames

* updates help messages

* moves ccf reference into new persistent reference bucket (#176)

moves ccf reference into new persistent reference bucket

* Update version to 1.0.1

* Bugfix/1.0.1 failing tests (#198)

* set default filemode to "w" for feature_writer

* add location constraint to s3 mock bucket
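For context on the mock-bucket fix: outside of us-east-1, S3's `create_bucket` call must carry a `CreateBucketConfiguration` with a `LocationConstraint`, and the moto mock enforces the same rule as the real service. A minimal sketch of the kwargs-building logic (the helper name `create_bucket_kwargs` is hypothetical, not part of this repository):

```python
def create_bucket_kwargs(bucket_name, region):
    """Build kwargs for an S3 create_bucket call.

    us-east-1 is the default region and must NOT be passed as a
    LocationConstraint; every other region must be passed explicitly.
    """
    kwargs = {"Bucket": bucket_name}
    if region != "us-east-1":
        kwargs["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return kwargs

# usage against a real or moto-mocked client (not run here):
# client = boto3.client("s3", region_name="us-west-2")
# client.create_bucket(**create_bucket_kwargs("my-test-bucket", "us-west-2"))
```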

Co-authored-by: Nathan Gouwens <nathang@alleninstitute.org>
Co-authored-by: NileGraddis <NileGraddis@Gmail.com>
3 people authored Aug 12, 2021
1 parent dcd5073 commit bc0f881
Showing 48 changed files with 8,250 additions and 514 deletions.
2 changes: 1 addition & 1 deletion .circleci/config.yml
@@ -186,7 +186,7 @@ workflows:
- deploy_pypi:
requires:
- test_linux
- test_mac
# - test_mac
filters:
tags:
only: /v\d+(\.\d+)*(.[A-Za-z][0-9A-Za-z]*)*/
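The tag filter above gates the deploy job on version-like tag names. A quick sketch with Python's `re` module illustrates which tags pass, under the assumption that the filter must match the entire tag name (note the pattern's unescaped `.` in the suffix group, copied verbatim from the config):

```python
import re

# the tag filter from the workflow above, minus the surrounding slashes
TAG_PATTERN = re.compile(r"v\d+(\.\d+)*(.[A-Za-z][0-9A-Za-z]*)*")

def is_release_tag(tag):
    """True if the whole tag name matches the release pattern."""
    return TAG_PATTERN.fullmatch(tag) is not None

# is_release_tag("v1.0.1")  -> True
# is_release_tag("1.0.1")   -> False (missing the leading "v")
```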
2 changes: 0 additions & 2 deletions README.md
@@ -1,8 +1,6 @@
neuron_morphology
=================

**[WIP] This codebase is a work in progress!** We plan to release the first version in the near future. In the meantime, we can't rule out bugs and API instabilities.

A package for working with single-neuron morphological reconstruction data, such as those in the [Allen Cell Types Database](https://celltypes.brain-map.org/). Provides tools for processing, visualizing, and analyzing such reconstructions.

For usage and installation instructions, see the [documentation](https://neuron-morphology.readthedocs.io/en/latest/).
3 changes: 2 additions & 1 deletion doc_requirements.txt
@@ -12,4 +12,5 @@ rasterio<2.0.0
shapely<2.0.0
imageio<3.0.0
pg8000<2.0.0 # required for running from the Allen Institute's internal database
jupyter<5.0.0
jupyter<5.0.0
sphinx-autoapi<2.0.0
30 changes: 10 additions & 20 deletions doc_template/conf.py
@@ -45,12 +45,21 @@ def get_version():
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
'sphinx.ext.autodoc',
'autoapi.extension',
'sphinx.ext.viewcode',
'sphinx.ext.autosummary',
'numpydoc'
]

# setup autoapi
autoapi_type = 'python'
autoapi_dirs = [
os.path.join(
os.path.dirname(os.path.dirname(__file__)),
'neuron_morphology'
)
]

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']

@@ -77,24 +86,6 @@ def get_version():

master_doc = "index"

autodoc_mock_imports = ["allensdk"]


def run_apidoc(*a):
parent = os.path.dirname(__file__)
grandparent = os.path.dirname(parent)
package = os.path.join(grandparent, "neuron_morphology")

sys.path = [grandparent] + sys.path
sp.check_call([
"sphinx-apidoc",
"-e",
"-o",
parent,
"--force",
package
])


def render_notebooks(_):
notebooks = os.path.join(
@@ -118,6 +109,5 @@ def render_notebooks(_):


def setup(app):
app.connect("builder-inited", run_apidoc)
app.connect('builder-inited', render_notebooks)

2 changes: 1 addition & 1 deletion neuron_morphology/VERSION.txt
@@ -1 +1 @@
1.0.0.beta0
1.0.1
5 changes: 3 additions & 2 deletions neuron_morphology/feature_extractor/feature_writer.py
@@ -27,7 +27,8 @@ def __init__(
self,
heavy_path: str,
table_path: Optional[str] = None,
formatters: Optional[Iterable["FeatureFormatter"]] = None
formatters: Optional[Iterable["FeatureFormatter"]] = None,
filemode: Optional[str] = 'w'
):
""" Formats and writes feature extraction outputs
@@ -51,7 +52,7 @@ def __init__(
self.output: Dict[str, Any] = {}

self.validate_table_extension()
self.heavy_file = h5py.File(self.heavy_path, driver="core")
self.heavy_file = h5py.File(self.heavy_path, filemode, driver="core")


def add_run(self, identifier: str, run: Dict[str, Any]):
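The effect of pinning `filemode` to "w" can be sketched without h5py: a writer whose file handle defaults to truncating mode replaces stale output from a previous run instead of appending to it. The class below is a hypothetical stand-in using plain text files (the real `FeatureWriter` opens an HDF5 file via `h5py.File`):

```python
import os
import tempfile

class SketchWriter:
    """Illustrative stand-in for FeatureWriter's filemode parameter."""

    def __init__(self, path, filemode="w"):
        # "w" truncates any leftover file from a previous run;
        # "a" would keep appending to it across runs
        self.handle = open(path, filemode)

    def write(self, text):
        self.handle.write(text)
        self.handle.close()

path = os.path.join(tempfile.mkdtemp(), "features.txt")
SketchWriter(path).write("run-1")
SketchWriter(path).write("run-2")  # default "w": old contents replaced
with open(path) as fh:
    contents = fh.read()           # "run-2", not "run-1run-2"
```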
46 changes: 28 additions & 18 deletions neuron_morphology/features/intrinsic.py
@@ -118,11 +118,16 @@ def num_branches(
"""
morphology = data.morphology
roots = morphology.get_roots()
nodes = morphology.get_node_by_types(node_types)
roots = morphology.get_roots_for_nodes(nodes)
num_branches = 0
for root in roots:
num_branches += calculate_branches_from_root(
morphology, root, node_types=node_types)
if (morphology.parent_of(root) is not None and
len(morphology.get_children_of_node_by_types(root, node_types)) > 1):
# if root is a branching node, include the branch that connects root to tree
num_branches += 1
return num_branches
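The new bookkeeping above adds one branch when a subtree root both hangs off a parent node and bifurcates, since the segment connecting it to its parent is itself a branch. A toy sketch of that counting rule on a plain dict tree (the helper and its representation are hypothetical, not the library's `Morphology` API):

```python
def count_branches(children, root, root_has_parent=False):
    """Count branches in a tree given as {node: [child, ...]}.

    A new branch starts at every child of the root and at every child
    of a bifurcation (a node with > 1 children).
    """
    total = 0
    stack = [root]
    while stack:
        node = stack.pop()
        kids = children.get(node, [])
        if node == root or len(kids) > 1:
            total += len(kids)
        stack.extend(kids)
    # the fix from the diff: a bifurcating subtree root that has a
    # parent contributes the branch connecting it to the rest of the tree
    if root_has_parent and len(children.get(root, [])) > 1:
        total += 1
    return total
```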


@@ -214,21 +219,29 @@ def calculate_max_branch_order_from_root(morphology,
root,
node_types=None):
"""
Calculate the maximum number of branches from a root to a tip
in a morphology. A branch is defined as being between
two bifurcations or between a bifurcation and a tip
Unlike mean_fragmentation and num_branches, if a node has
multiple children it is counted as a single bifurcation point
Calculate the greatest number of branches encountered among all
directed paths from the morphology's root to its leaves. A branch is
defined as a root->leaf ordered path for which:
1. the first node on the path is either
a. a bifurcation (has > 1 children)
b. the root node
2. the last node on the path is either
a. a bifurcation
b. a leaf node (has 0 children)
Parameters
----------
morphology: the reconstruction whose max branch order will be
calculated
root: treat this node as root
node_types: If not None, consider only root->leaf paths whose leaf
nodes are among these types (see neuron_morphology constants)
morphology: a morphology object
root: the root node to traverse from
Returns
-------
The greatest branch count encountered among all considered root->leaf
paths
node_types: a list of node types (see neuron_morphology constants)
"""

root_id = morphology.node_id_cb(root)
@@ -237,7 +250,7 @@

def branch_visitor(node, counter, node_types):
cur_branches = counter['branches_to_node'][node['id']]
children = morphology.get_children(node, node_types)
children = morphology.get_children(node)
num_children = len(children)

if num_children > 1:
@@ -250,19 +263,16 @@ def branch_visitor(node, counter, node_types):
counter['branches_to_node'][children[0]['id']] = 1
else:
counter['branches_to_node'][children[0]['id']] = cur_branches
elif num_children == 0:
elif num_children == 0 and (node_types is None or node['type'] in node_types):
if cur_branches > counter['max_branches']:
counter['max_branches'] = cur_branches

visitor = partial(branch_visitor,
counter=counter,
node_types=node_types)
neighbor_cb = partial(child_ids_by_type,
morphology=morphology,
node_types=node_types)
morphology.depth_first_traversal(visitor,
start_id=root_id,
neighbor_cb=neighbor_cb)
start_id=root_id)


return counter['max_branches']
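The rewritten traversal carries a branch count down to every node, bumps it at bifurcations, and records a maximum only at leaves whose type passes the filter. A self-contained sketch of that logic on a dict-based tree (hypothetical representation; the real code uses `Morphology.depth_first_traversal`):

```python
def max_branch_order(children, types, root, node_types=None):
    """Greatest number of branches along any root->leaf path.

    children: {node: [child, ...]}; types: {node: type}. Only leaves
    whose type is in node_types (when given) are considered, mirroring
    the leaf-side filter in the diff above.
    """
    best = 0
    stack = [(root, 0)]
    while stack:
        node, branches = stack.pop()
        kids = children.get(node, [])
        if len(kids) > 1:
            # a bifurcation: each child starts a fresh branch
            for kid in kids:
                stack.append((kid, branches + 1))
        elif len(kids) == 1:
            # a continuation keeps the count (the first segment counts as 1)
            stack.append((kids[0], branches if branches else 1))
        elif node_types is None or types[node] in node_types:
            best = max(best, branches)
    return best
```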

68 changes: 37 additions & 31 deletions neuron_morphology/features/path.py
@@ -9,7 +9,7 @@
MorphologyLike, get_morphology)


# TODO: There is a breadth_first_traversal method defined on Morphology. We
# TODO: There is a breadth_first_traversal method defined on Morphology. We
# should use that here
def _calculate_max_path_distance(morphology, root, node_types):
# if root not specified, grab the soma root if it exists, and the
@@ -18,11 +18,11 @@ def _calculate_max_path_distance(morphology, root, node_types):
if node_types is None:
node_types = [SOMA, AXON, APICAL_DENDRITE, BASAL_DENDRITE]

nodes = morphology.get_node_by_types(node_types)
if root is None:
root = morphology.get_root()
total_length = 0.0
# sum up length for all child compartments
max_tip_type = None
while len(morphology.get_children(root)) > 0:
# the next node is a continuation from this node (ie, no
# bifurcation). update path length and evaluate the
@@ -33,8 +33,8 @@
if len(morphology.get_children(root)) == 1:
# get length of associated compartment, if it exists, and if
# it's not soma
if root['type'] != SOMA and root['type'] in node_types:
compartment = morphology.get_compartment_for_node(root, node_types)
if root['type'] != SOMA:
compartment = morphology.get_compartment_for_node(root)
if compartment:
total_length += morphology.get_compartment_length(compartment)
root = morphology.get_children(root)[0]
@@ -43,57 +43,63 @@
# recurse to find length of each child branch and then
# exit loop
max_sub_dist = 0.0
children_of_root = morphology.get_children(root, node_types)
children_of_root = morphology.get_children(root)
for child in children_of_root:
dist = _calculate_max_path_distance(morphology, child, node_types)
if dist > max_sub_dist:
dist, tip_type = _calculate_max_path_distance(morphology, child, node_types)
if dist > max_sub_dist and tip_type in node_types:
max_sub_dist = dist
max_tip_type = tip_type
total_length += max_sub_dist
break
# the length of this compartment hasn't been included yet, and if it
# isn't part of the soma
if max_tip_type is None:
max_tip_type = root["type"]
if root['type'] != SOMA:
compartment = morphology.get_compartment_for_node(root, node_types)
compartment = morphology.get_compartment_for_node(root)
if compartment:
total_length += morphology.get_compartment_length(compartment)
return total_length
return total_length, max_tip_type


def calculate_max_path_distance(morphology, root=None, node_types=None):
def calculate_max_path_distance(morphology, root, node_types=None):
""" Helper for max_path_distance. See below for more information.
"""

max_path = 0.0
roots = morphology.get_roots_for_analysis(root, node_types)
if roots is None:
root_children = morphology.get_children(root)
if root_children is None:
return float('nan')
for node in roots:
path = _calculate_max_path_distance(morphology, node, node_types)
for node in root_children:
path, tip_type = _calculate_max_path_distance(morphology, node, node_types)
if path > max_path:
max_path = path
if node_types and tip_type in node_types:
max_path = path
else:
max_path = path
return max_path
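The key change above is that the recursive helper now returns a `(distance, tip_type)` pair, so callers can discard candidate paths whose tips have the wrong node type. A toy sketch of that pattern with unit-length edges (hypothetical dict representation; the real code sums compartment lengths):

```python
def max_path_to_tip(children, types, node, node_types=None):
    """Return (distance, tip_type) for the longest admissible downstream path.

    children: {node: [child, ...]}; types: {node: type}. Each edge has
    unit length for the sketch. When node_types is given, only subpaths
    ending in an allowed tip type are kept.
    """
    kids = children.get(node, [])
    if not kids:
        return 0.0, types[node]
    best_dist, best_type = 0.0, None
    for kid in kids:
        dist, tip_type = max_path_to_tip(children, types, kid, node_types)
        if dist + 1.0 > best_dist and (node_types is None or tip_type in node_types):
            best_dist, best_type = dist + 1.0, tip_type
    if best_type is None:
        # no admissible subpath: treat this node itself as the tip,
        # mirroring the "if max_tip_type is None" fallback in the diff
        best_dist, best_type = 0.0, types[node]
    return best_dist, best_type
```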


@marked(RequiresRoot)
@marked(Geometric)
def max_path_distance(
data: MorphologyLike,
node_types: Optional[List[int]] = None
node_types: Optional[List[int]] = None
) -> float:

""" Calculate the distance, following the path of adjacent neurites, from
the soma to the furthest compartment. This is equivalent to the distance
""" Calculate the distance, following the path of adjacent neurites, from
the soma to the furthest compartment. This is equivalent to the distance
to the furthest SWC node.
Parameters
----------
data : the input reconstruction
node_types : if provided, restrict the calculation to nodes of these
node_types : if provided, restrict the calculation to nodes of these
types
Returns
-------
The along-path distance from the soma to the farthest (in the along-path
The along-path distance from the soma to the farthest (in the along-path
sense) node.
"""
@@ -113,18 +119,18 @@ def early_branch_path(
node_types: Optional[List[int]] = None,
soma: Optional[Dict] = None
) -> float:
""" Returns the ratio of the longest 'short' branch from a bifurcation to
the maximum path length of the tree. In other words, for each bifurcation,
the maximum path length below that branch is calculated, and the shorter of
these values is used. The maximum of these short values is divided by the
""" Returns the ratio of the longest 'short' branch from a bifurcation to
the maximum path length of the tree. In other words, for each bifurcation,
the maximum path length below that branch is calculated, and the shorter of
these values is used. The maximum of these short values is divided by the
maximum path length.
Parameters
----------
data : the input reconstruction
node_types : if provided, restrict the calculation to nodes of these
node_types : if provided, restrict the calculation to nodes of these
types
soma : if provided, use this node as the root, otherwise infer the root
soma : if provided, use this node as the root, otherwise infer the root
from the argued morphology
Returns
@@ -136,7 +142,7 @@
morphology = get_morphology(data)
soma = soma or morphology.get_root()

path_len = _calculate_max_path_distance(morphology, soma, node_types)
path_len = _calculate_max_path_distance(morphology, soma, node_types)[0]
if path_len == 0:
return 0.0

@@ -148,7 +154,7 @@
continue

current_short = min(
_calculate_max_path_distance(morphology, child, node_types)
_calculate_max_path_distance(morphology, child, node_types)[0]
for child in morphology.children_of(node)
)

@@ -240,7 +246,7 @@ def calculate_mean_contraction(morphology, root=None, node_types=None):
@marked(RequiresRoot)
def mean_contraction(
data: MorphologyLike,
node_types: Optional[List[int]] = None
node_types: Optional[List[int]] = None
) -> float:
""" Calculate the average contraction of all sections. In other words,
calculate the average ratio of euclidean distance to path distance
@@ -250,7 +256,7 @@
Parameters
----------
data : the input reconstruction
node_types : if provided, restrict the calculation to nodes of these
node_types : if provided, restrict the calculation to nodes of these
types
Returns
@@ -262,6 +268,6 @@
morphology = get_morphology(data)
return calculate_mean_contraction(
morphology,
morphology.get_root(),
None,
node_types
)
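As background for the contraction feature touched above: contraction is the ratio of straight-line (euclidean) distance to along-path distance between a section's endpoints, so 1.0 means a perfectly straight section. A minimal sketch of that ratio for one ordered list of 3D points (hypothetical helper, not the library's implementation):

```python
import math

def contraction(points):
    """Euclidean-to-path-length ratio for one section.

    points: ordered (x, y, z) nodes from one section endpoint to the
    other; 1.0 means perfectly straight, smaller means more tortuous.
    """
    path = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    euclidean = math.dist(points[0], points[-1])
    return euclidean / path if path else float("nan")
```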