Downbeat tracking: support weighting for time signatures #402
Comments
Yes, that would be possible.
Thanks @superbock, I'll have a go at working that out. For some reason I'm getting far fewer misidentifications now, but I guess it's a good thing to have available. Is this something you'd consider in a PR?
Definitely yes!
* Optional parameter, implicitly defaults to ones for the array
* Clean up the handling of lengths in the constructor; it was getting verbose
* Check weights don't sum to zero, to avoid divide-by-zero pain
* Weight the HMM results in log space by normalised weight values, as suggested by @superbock
* Add a new test to prove that (sufficient, but arbitrary) weighting towards 3-time (over 4-time) does indeed return 3-time beat results

This fixes CPJKU#402.
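The log-space weighting described in the list above can be sketched roughly as follows. This is a hypothetical illustration of the idea, not the actual PR code; the function name and example numbers are invented:

```python
import numpy as np

def weighted_best_pattern(log_likelihoods, weights):
    """Pick the rhythmic pattern whose HMM log-likelihood wins
    after weighting in log space by normalised weights.

    log_likelihoods: one log-probability per beats_per_bar pattern
    weights: relative preference per pattern (need not sum to 1)
    """
    weights = np.asarray(weights, dtype=float)
    total = weights.sum()
    if total == 0:
        # guard against divide-by-zero when normalising
        raise ValueError("weights must not sum to zero")
    norm = weights / total
    # adding log(weight) in log space == multiplying the
    # probability by the normalised weight
    weighted = np.asarray(log_likelihoods, dtype=float) + np.log(norm)
    return int(np.argmax(weighted))

# favour pattern index 1 (say, 4/4) over indices 0 and 2
best = weighted_best_pattern([-120.0, -121.0, -119.5], [1, 10, 2])
```

With equal weights the log-bias terms are identical and the raw likelihoods decide; a larger weight shifts the decision towards that pattern without retraining any model.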
How can I detect bar lines or measure lines from audio/music?
You can use the downbeat tracking module (
Can you please give me a code example, so that I can understand better?
(Almost) all classes come with examples. For a running example, please see
Okay, thank you very much!
Yes, simply check what the highest beat number is. The output is in the format |
Okay, thanks!
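The answer above says to check the highest beat number in the output. Assuming the tracker returns a two-column array of `[time, beat_number]` rows, which matches madmom's documented downbeat output, that check can be sketched as:

```python
import numpy as np

def time_signature_from_beats(beats):
    """Infer beats per bar from downbeat tracker output.

    beats: rows of [time_in_seconds, beat_number]; beat numbers
    cycle 1, 2, ..., N within each bar, so the highest beat
    number that occurs is the number of beats per bar.
    """
    beats = np.asarray(beats, dtype=float)
    return int(beats[:, 1].max())

# a short 3/4 excerpt: beat numbers cycle 1, 2, 3 (times invented)
excerpt = [[0.10, 1], [0.60, 2], [1.10, 3], [1.60, 1]]
```

For very short or noisy excerpts the maximum may be missing a beat (e.g. only beats 1 and 2 of a 3/4 bar observed), so longer inputs give a more reliable answer.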
Please do not edit comments in such a way that the content changes considerably! The comments now read as if I was not answering your question at all; but you were asking how to find the time signature of a song, and I answered that question. Also, please do not use existing issues to ask something completely unrelated to the issue at hand. But to answer your question: madmom does not contain functionality to discover song structure.
Okay, thanks!
Use case

When I'm running `DBNDownBeatTrackingProcessor`, I'm passing the parameter `beats_per_bar=[3, 4, 6]` to detect from a mixed bunch over many genres (but mostly pop / rock / soul / dance music). The vast majority is in common (4) time, with a few 6/8 plus a few "difficult" ones. However, by including especially `6` in this list, I find the processor occasionally misidentifying 4/4 pieces, especially rock (and heavier), as having six beats per downbeat in the output.

Workarounds attempted

Passing `[40, 100, 40]` in the hope that this would weight the probability, but this felt like it was perhaps harming the downbeat detection overall; is that right? Which parameters need models to be retrained, etc.?

Suggestion

Would it be possible to provide weightings, or other forms of influencing the selection of beats per bar given in an array, e.g. just `weights=[1, 10, 2]` or similar?

Thanks
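The suggested `weights` parameter could behave roughly like the sketch below: one weight per `beats_per_bar` entry, defaulting to ones, normalised before use. This is a hypothetical illustration of the semantics, not madmom's actual signature:

```python
import numpy as np

def normalised_weights(beats_per_bar, weights=None):
    """Sketch of possible `weights` parameter semantics
    (hypothetical; not the actual madmom API).

    beats_per_bar: list of candidate beats-per-bar values
    weights: optional relative preference per candidate;
    defaults to equal weighting (ones).
    """
    if weights is None:
        # no preference given: treat all candidates equally
        weights = np.ones(len(beats_per_bar))
    weights = np.asarray(weights, dtype=float)
    if len(weights) != len(beats_per_bar):
        raise ValueError("need one weight per beats_per_bar entry")
    if weights.sum() == 0:
        raise ValueError("weights must not sum to zero")
    return weights / weights.sum()

# e.g. beats_per_bar=[3, 4, 6] with weights=[1, 10, 2]
w = normalised_weights([3, 4, 6], [1, 10, 2])  # 1/13, 10/13, 2/13
```

Normalising keeps the weights interpretable as relative preferences, so `[1, 10, 2]` and `[10, 100, 20]` behave identically.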