Error When Running Data Quality Check - 'names' attribute [1] must be the same length as the vector [0] #562
Hi - some additional info from testing: when I remove `"FIELD"` from `checkLevels`, we get the output JSON.

Fails: `checkLevels <- c("TABLE", "FIELD", "CONCEPT")`
Works: `checkLevels <- c("TABLE", "CONCEPT")`

Any advice on zeroing in on this issue?
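For reference, the workaround described above can be sketched as follows. This is only an illustration of the reported behavior, not a fix; the connection setup and remaining `executeDqChecks` arguments are assumed to be the same as in the full script later in this thread.

```r
# Workaround sketch (assumption: all other arguments unchanged from the
# full script shared below). Including "FIELD" triggers the
# "'names' attribute [1] must be the same length as the vector [0]" error;
# dropping it lets the run complete and write results.json.
checkLevels <- c("TABLE", "CONCEPT")  # was c("TABLE", "FIELD", "CONCEPT")

DataQualityDashboard::executeDqChecks(
  connectionDetails = connectionDetails,  # as defined in the full script
  cdmDatabaseSchema = "omop",
  resultsDatabaseSchema = "omop",
  cdmSourceName = "OMOP CDM v5.4 Demo Environment",
  cdmVersion = "5.4",
  outputFolder = "output",
  outputFile = "results.json",
  checkLevels = checkLevels)
```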
Hi there, can you please share the full script you ran that resulted in the error (i.e., the values of the parameters passed into `executeDqChecks`)? And the version of the package you're using? Thanks.
Hi @katy-sadowski. Sorry about the late reply. Here it is; thanks for offering to help! Sorry it's a little messy, I just copied what I had without modifying it, in case there were any other clues in there.

```r
#-------- INSTALL PACKAGES
if (!require("drat")) install.packages("drat")
drat::addRepo("OHDSI")
if (!require("remotes")) install.packages("remotes")
remotes::install_github("OHDSI/DataQualityDashboard")
if (!require("DatabaseConnector")) install.packages("DatabaseConnector")

# Download JDBC drivers
Sys.setenv(DATABASECONNECTOR_JAR_FOLDER = "//home//ohdsi")

#------- IMPORT PACKAGES
library("DataQualityDashboard")
library("DatabaseConnector")

#------- CONFIGURE CONNECTION
downloadJdbcDrivers("spark")
connectionDetails <- createConnectionDetails(
  dbms = "spark",
  connectionString = "jdbc:spark://...UID=token;UseNativeQuery=1",
  user = "token",
  password = "...")

cdmDatabaseSchema <- "omop"
resultsDatabaseSchema <- "omop"
cdmSourceName <- "OMOP CDM v5.4 Demo Environment"
cdmVersion <- "5.4"
numThreads <- 1
sqlOnly <- FALSE
sqlOnlyIncrementalInsert <- FALSE
sqlOnlyUnionCount <- 1
outputFolder <- "output"
outputFile <- "results.json"
verboseMode <- TRUE
writeToTable <- FALSE
writeTableName <- ""
writeToCsv <- FALSE
csvFile <- ""
checkLevels <- c("TABLE", "FIELD", "CONCEPT")
allChecks <- DataQualityDashboard::listDqChecks(cdmVersion = cdmVersion)
checkNames <- allChecks$checkDescriptions$checkName
tablesToExclude <- c("CONCEPT", "VOCABULARY", "CONCEPT_ANCESTOR",
                     "CONCEPT_RELATIONSHIP", "CONCEPT_CLASS", "CONCEPT_SYNONYM",
                     "RELATIONSHIP", "DOMAIN", "VISIT_OCCURRENCE", "VISIT_DETAIL")

DataQualityDashboard::executeDqChecks(
  connectionDetails = connectionDetails,
  cdmDatabaseSchema = cdmDatabaseSchema,
  resultsDatabaseSchema = resultsDatabaseSchema,
  cdmSourceName = cdmSourceName,
  cdmVersion = cdmVersion,
  numThreads = numThreads,
  sqlOnly = sqlOnly,
  sqlOnlyUnionCount = sqlOnlyUnionCount,
  sqlOnlyIncrementalInsert = sqlOnlyIncrementalInsert,
  outputFolder = outputFolder,
  outputFile = outputFile,
  verboseMode = verboseMode,
  writeToTable = writeToTable,
  writeToCsv = writeToCsv,
  csvFile = csvFile,
  checkLevels = checkLevels,
  tablesToExclude = tablesToExclude,
  checkNames = checkNames)

DataQualityDashboard::viewDqDashboard("~/output/results.json")
```
Thanks for the code. Can you please confirm which version of the package you're running?
Hi @katy-sadowski, oops, sorry. Here it is.
Thanks! Can you try running again with
Hi! We're testing this library out on top of an OMOP CDM 5.4 in Databricks (Spark), and I'm running into some issues with its usage. After running `DataQualityDashboard::executeDqChecks`, I receive the error below. I would like to save the outputFile JSON and display it in a Shiny app, but the JSON never gets generated after this error occurs.
Here is the output of the whole run, to demonstrate that the connection to our DBSQL cluster is successful and that I'm actually able to complete quite a few checks:
Can you help shed some light on this error? We think this Dashboard is going to be very helpful but can't seem to get past this problem. Thanks!
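As context for the Shiny goal mentioned above: once a run completes and a results file is written, it can be inspected or displayed directly. A minimal sketch, assuming the `jsonlite` package is installed and that the file landed at the default `output/results.json` path from the script in this thread (the JSON field names used here are assumptions about the results format, not confirmed from this thread):

```r
# Sketch: inspect a DQD results file outside the bundled dashboard.
# Assumes jsonlite is installed and executeDqChecks wrote output/results.json.
library(jsonlite)

results <- jsonlite::fromJSON("output/results.json")

# Assumption: the parsed object exposes per-check rows (e.g. a CheckResults
# element) that could be tabulated or fed into a custom Shiny app.
str(results, max.level = 1)

# Or launch the viewer shipped with the package:
DataQualityDashboard::viewDqDashboard("output/results.json")
```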