
[ENH] - Add SpectralTimeModel and SpectralTimeEventModel #283

Merged · 110 commits · Mar 27, 2024
Conversation

@TomDonoghue (Member) commented Jul 12, 2023

Overview

This PR adds new model objects to manage applying the spectral model across time (spectrograms), and across events, where each event is itself across time (multiple spectrograms). It also includes associated additions, updates, and re-organizations to support the extension to explicitly applying the spectral model over time windows, and code updates that better support the broader strategy of having more / different model objects for different use cases.

Overall, this PR adds the new model objects SpectralTimeModel and SpectralTimeEventModel, which closely mirror our previous objects. Notably, SpectralTimeModel is really a special case of SpectralGroupModel that takes spectrograms, and SpectralTimeEventModel extends this functionality to support analyses across multiple spectrograms. Practical updates to the new objects include changes to things like the plots / reports, so that they best describe and work with the parameters in the new context of estimation over time (eg. plotting parameters over windows).

Perhaps the most notable update needed to support this is to how peak parameters are managed, for which we have to choose a strategy for organizing peaks across models. Here we follow the logic started / developed for exporting model results to dataframes (#196): the main / suggested approach is to provide a "Bands" object definition, which is then used to organize parameters by the defined bands. We should keep in mind the pros / cons of this, in terms of things like how we select a single dominant peak (which may not properly reflect bandwidth if there are overlapping peaks, etc). Note that while a new time_results (or event_time_results) attribute is added and used here, the full group_results (or event_group_results) are maintained, so no information is lost or discarded (it is only sub-selected from).

Examples

Overview examples of the two new objects:

SpectralTimeModel

Quick version example of SpectralTimeModel - given a spectrogram of PSDs over time windows, this can be parameterized as such:

from specparam import SpectralTimeModel, Bands

# Define a bands object to organize peak parameters
bands = Bands({'alpha' : [7, 14]})

# Initialize the spectral model
ft = SpectralTimeModel()

# Fit the spectrogram and print out report
ft.report(freqs, spectrogram, peak_org=bands)

This would give the following output:
[screenshot: SpectralTimeModel report output]
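
For orientation, here is a minimal sketch of how the results described in the overview might be pulled out after fitting - the attribute names (time_results, group_results) come from the description above, but exactly how they print / index is an assumption here rather than a checked part of the final API:

# Band-organized, per-window results (as described in the overview above)
print(ft.time_results)

# The full underlying model results are kept alongside (nothing is discarded)
print(ft.group_results)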

SpectralTimeEventModel

Quick version example of SpectralTimeEventModel - given a list of spectrograms of PSDs over time windows across a set of events (assuming the same set of windows across events), this can be parameterized as such:

from specparam import SpectralTimeEventModel, Bands

# Define a bands object to organize peak parameters
bands = Bands({'alpha' : [7, 14]})

# Initialize the spectral model
fe = SpectralTimeEventModel()

# Fit the spectrograms and print out report
fe.report(freqs, spectrograms, peak_org=bands)

This would give the following output:
[screenshot: SpectralTimeEventModel report output]

Full example code that can be run

For testing & exploration, this PR includes associated updates, such as adding a sim_spectrogram function, which can be used to explore this functionality.

Self-contained example code to run these examples:

from specparam import SpectralTimeModel, SpectralTimeEventModel, Bands
from specparam.sim import sim_spectrogram
from specparam.plts.spectra import plot_spectrogram

# Create & plot an example spectrogram
n_pre_post = 50
freq_range = [3, 25]
ap_params = [[1, 1.5]] * n_pre_post + [[1, 1]] * n_pre_post
pe_params = [[10, 1.5, 2.5]] * n_pre_post + [[10, 0.5, 2.]] * n_pre_post
freqs, spectrogram = sim_spectrogram(n_pre_post * 2, freq_range, ap_params, pe_params, nlvs=0.1)
plot_spectrogram(freqs, spectrogram)

# Test out the SpectralTimeModel object
bands = Bands({'alpha' : [7, 14]})
ft = SpectralTimeModel()
ft.report(freqs, spectrogram, peak_org=bands)

# Simulate & collect together multiple spectrograms to represent 'events'
freqs, spectrogram1 = sim_spectrogram(n_pre_post * 2, freq_range, ap_params, pe_params, nlvs=0.1)
freqs, spectrogram2 = sim_spectrogram(n_pre_post * 2, freq_range, ap_params, pe_params, nlvs=0.1)
freqs, spectrogram3 = sim_spectrogram(n_pre_post * 2, freq_range, ap_params, pe_params, nlvs=0.1)
spectrograms = [spectrogram1, spectrogram2, spectrogram3]

# Test out the SpectralTimeEventModel object
fe = SpectralTimeEventModel(verbose=False)
fe.report(freqs, spectrograms, peak_org=bands)

Additional Notes

Additional notes & comments:

  • The new model objects support all the same model settings and modes as the previous objects (eg. can fit with a knee, and everything is updated accordingly).
  • The new objects are also updated such that all the attributes & methods work the same or equivalently as they do in the inherited-from objects (eg. get_params, get_results, get_model, etc) - see the sketch after this list.
  • Both objects are flexible in terms of the bands definition - reports and so on dynamically update to the number of defined bands.
  • A key part of working on this update was choosing and designing what the outputs should look like, so any comments and suggestions on the report info / plots / layout are appreciated!
  • This PR does not include any changes to the underlying algorithm, and does not (yet) include any new post-processing (eg. smoothing of parameters across windows post estimation) or estimation tweaks (eg. using adjacent window estimates as initial guesses - [ENH]: Sliding window: Use previous params as guesses for next fit #277). This PR serves as the outline for the model objects and API upon which such extensions can be added.
  • This PR also updates actions to run on all PRs (not just those to main), and updates the minimum tested / supported Python version to 3.7 (from 3.6). The version change is related to a change that makes one of the tests fail on 3.6 - it's probably a minor thing to update - so we could tweak things to maintain 3.6 support if we want (?).
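
As a rough illustration of the "equivalent methods" point above, here's a sketch of what that usage could look like - the parameter labels ('aperiodic_params', 'exponent') and call signatures are assumed to carry over from the existing group object conventions, so treat this as illustrative rather than exact:

# Extract aperiodic exponents across time windows, mirroring the group object API
exps = ft.get_params('aperiodic_params', 'exponent')

# Pull out the full set of results, and regenerate a model object for a single window
results = ft.get_results()
fm = ft.get_model(0)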

Reviewing

In terms of reviewing - I've tagged core development related people (@rdgao, @ryanhammonds, and @voytek) for explicit review. Before we do the deep dive into any nitty-gritty code checks, I would suggest a first sweep in which people have a look, try it out, and comment on any high-level ideas about the general strategy taken here. Once we've done that kind of check, other people have run the basics, and we've decided we like this approach, then we can coordinate on the detailed code review.

As part of this "general check-in" review, it would also be great to hear from people who are on the user side of things, for which I'm tagging @mwprestonjr @benderas7 @dillancellier @sydney-smith @QuirinevEngen (though anyone untagged is more than welcome to give comments / and no worries at all if people don't have time to try this out right now).

To install and test this new functionality:

# Clone the branch with these updates (gets specific branch & skips full history, for efficiency)
git clone --branch time --single-branch --depth 1 https://github.com/fooof-tools/fooof
# Move into the directory and install the current branch
cd fooof
pip install .

Once installed, you can open a notebook (or whatever you prefer) and start with the example code section above, which should run and give you example reports matching what I pasted in. It would also be great if people can try it out on any real data they have that matches these use cases (as I have not done this yet).

@TomDonoghue (Member, Author) commented:
@benderas7 - a big belated thank you for your deep dive here, and apologies it took me so long to get back to working on this update! Your comments were super helpful, and helped fix some specific things, as well as opening up some new tests / ideas that also led to further tweaks and improvements.

As a quick overview of the recent specific changes based on your comments:

  • re 4) I added a get_band_peak_event function
  • re 6) plotting: you were right that my early tests didn't address variable peak presence very well. This is now improved, most notably by using nan-robust means and shades, so that the data doesn't disappear as soon as there are some "missing" peaks, and by adding a 'presence' plot to the report, as you suggested, to be able to see peak presence across time windows

And a couple more comments:

  • re 3), the reason the output of fe.get_params is a list is that the arrays of peaks per event can be different shapes, and so they can't be stacked into an evenly shaped array. This is annoying, including that it makes the output of this method different from the equivalent method on other objects. For the moment, I've left it like this, though I might revisit it to see if there is a way to clean it up. Regardless, hopefully the new built-in dataframe exporters mean that you don't need to manually organize the parameters into dataframes anymore (see the sketch below).
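
To make that point concrete, here is a small sketch of handling the list output (the 'peak_params' label and the list-of-arrays structure follow from the comment above; the rest is purely illustrative):

import numpy as np

# Per-event peak parameters come back as a list, since peak counts can differ across events
event_peaks = fe.get_params('peak_params')

# Inspect the (potentially different) array shape for each event
for ind, peaks in enumerate(event_peaks):
    print('Event {}: peak array shape {}'.format(ind, np.asarray(peaks).shape))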

@TomDonoghue (Member, Author) commented:
Okay - I've made a series of relatively minor updates here, nothing too major. Rather than trying to tag people back in for a review of this rather massive PR, I'm going to merge it, tag a new release candidate, and we can check and tweak from there.

To be clear - it's not that I don't want further checks / suggestions, etc - precisely the opposite. I think the better way to check this update from here is to merge it, get people trying it, and then track any quirks / work on improvements from there!

@TomDonoghue merged commit 809a3cf into main on Mar 27, 2024 (6 checks passed)
@TomDonoghue deleted the time branch on Mar 27, 2024
@TomDonoghue mentioned this pull request on Apr 4, 2024
Labels: 2.0 (Targeted for the specparam 2.0 release)