I'm using Rgtsvm to build SVM models for a large dataset (nrows = 2,500,000).
The problem occurs only with particular numbers of columns.
For example:
1-10 columns with 2.5 million rows -> no problem creating the SVM model
11-19 columns with 2.5 million rows -> Error: An iteration made no progress
Error in gtsvmtrain.classfication.call(y, x, param, verbose = verbose) : Error in GPU process.
20-132 columns with 2.5 million rows -> no problem creating the SVM model
Any ideas?
There should be no semantic error, because I'm attaching the columns in a for loop, so the dataset always has the same layout: | feature 1 | feature 2 | ... | target column |
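For reference, here is a minimal sketch of the setup, with random placeholder data instead of my real features, assuming Rgtsvm's `svm()` follows the usual e1071-style x/y interface (the code below is an approximation for reproduction purposes, not the exact production script):

```r
# Minimal reproduction sketch (placeholder data, not the real features).
# Assumes Rgtsvm's svm() accepts e1071-style arguments.
library(Rgtsvm)

n_rows <- 2500000   # as in the real dataset; reduce for a quick local test
n_cols <- 15        # a column count in the failing 11-19 range

# Build the feature matrix column by column, as in my for loop:
# | feature 1 | feature 2 | ... | target column |
x <- matrix(0, nrow = n_rows, ncol = n_cols)
for (j in seq_len(n_cols)) {
  x[, j] <- rnorm(n_rows)                               # placeholder feature values
}
y <- factor(sample(c(0L, 1L), n_rows, replace = TRUE))  # placeholder target column

# This is the call that fails with "An iteration made no progress"
# when ncol(x) is between 11 and 19.
model <- svm(x, y, type = "C-classification", kernel = "radial")
```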