
System.ArgumentNullException on large batch processing sets #98

Open
sharrken opened this issue Jun 14, 2018 · 2 comments

Comments

@sharrken

When processing large numbers of files with the batch process option, I'm having repeatable failures which halt processing.

Error as follows:

[#-------------------] 28525/532893 5.4% \System.ArgumentNullException: Value cannot be null.
   at System.Threading.Monitor.Enter(Object obj)
   at VGAudio.Cli.ProgressBar.TimerHandler(Object state)
   at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
   at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
   at System.Threading.TimerQueueTimer.CallCallback()
   at System.Threading.TimerQueueTimer.Fire()
   at System.Threading.TimerQueue.FireNextTimers()

Command:

VGAudioCli.exe -b -r -i "C:\in\" -o "C:\out\" --out-format wav

OS:
Win 10 Pro Workstation, 1803, 17134.81

Data is ~550,000 ATRAC9 files being converted to 16-bit PCM WAVs. At lower batch sizes (around 50,000 files) I still get the exact same error, but inconsistently: sometimes a run completes successfully, sometimes it fails. On the full set I usually hit the error ~3-5% of the way in; on the 50,000-file subsets I have seen failures as late as ~60% in, but also clean runs.

@Thealexbarney
Owner

If I'm reading this correctly, the error occurs on the full set 100% of the time?

@sharrken
Author

In my experience yes, but statistically probably not.

Basically, it seems that on a large batch this exception has some probability X of occurring per run. With the 50,000-file subsets I can, by repeating a subset multiple times, eventually get all of its files to process without the exception, though some subsets take ~5-10 runs, with failures anywhere from 0.9% in to the high 80%s. This may mean the exception can happen even on small batches, but with a probability low enough that you would rarely ever see it, or it may only come into play once a batch exceeds a certain size.

Obviously, given how probabilities work, I could presumably get the full 550,000-file set to complete if I gave it enough runs, but the odds are poor enough that it is more practical to run the subsets even at a 1:10 success rate than to run the full 550,000 at 1:100 or 1:1000 or whatever it is.
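For what it's worth, the stack trace points at `Monitor.Enter` receiving a null lock object inside `ProgressBar.TimerHandler`, which fits the intermittent, size-dependent behavior: a race between the timer callback and teardown. A minimal sketch of that failure mode (hypothetical field names, not VGAudio's actual source):

```csharp
using System;
using System.Threading;

// Hypothetical reconstruction of the bug pattern: the timer callback
// locks on a field that teardown can null out concurrently, so the
// lock statement calls Monitor.Enter(null) and throws
// ArgumentNullException: "Value cannot be null."
class ProgressBarSketch : IDisposable
{
    private Timer _timer;

    public ProgressBarSketch()
    {
        // Fire TimerHandler periodically to redraw the bar.
        _timer = new Timer(TimerHandler, null, 0, 125);
    }

    private void TimerHandler(object state)
    {
        // If Dispose runs before this callback fires (or between
        // checks), _timer is null here and lock(...) throws.
        lock (_timer)
        {
            // ... redraw the progress bar ...
        }
    }

    public void Dispose()
    {
        _timer?.Dispose();
        _timer = null; // races with TimerHandler above
    }
}
```

If that is roughly the shape of the real code, locking on a dedicated readonly object (instead of the timer field) and guarding the callback after disposal would remove the race; the larger the batch, the more timer ticks, hence the higher the chance of hitting the window.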
