[apache#2566] feat(spark-connector): Refactoring integration tests for spark-connector
caican00 committed Mar 19, 2024
1 parent 2847dc4 commit b2f31e8
Showing 1 changed file with 1 addition and 2 deletions.
@@ -52,8 +52,7 @@ private static String getInsertWithoutPartitionSql(String tableName, String valu
"struct(1, 'a')");

// Use a custom database not the original default database because SparkCommonIT couldn't
// read&write
// data to tables in default database. The main reason is default database location is
// read&write data to tables in default database. The main reason is default database location is
// determined by `hive.metastore.warehouse.dir` in hive-site.xml which is local HDFS address
// not real HDFS address. The location of tables created under default database is like
// hdfs://localhost:9000/xxx which couldn't read write data from SparkCommonIT. Will use default
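The comment in this hunk explains why the test suite writes to a dedicated database rather than to default: the default database location comes from `hive.metastore.warehouse.dir` in hive-site.xml, which may point at a local HDFS address the tests cannot reach. Below is a minimal sketch of that pattern, assuming a Hive-enabled SparkSession; the class, database name, and HDFS path are hypothetical and not taken from SparkCommonIT.

import org.apache.spark.sql.SparkSession;

public class CustomDatabaseSketch {
  public static void main(String[] args) {
    // Hive support is needed so databases and tables go through the metastore.
    SparkSession spark = SparkSession.builder()
        .appName("custom-db-sketch")
        .enableHiveSupport()
        .getOrCreate();

    // Hypothetical database name; the real tests choose their own.
    String db = "spark_connector_it";

    // Giving the database an explicit, reachable location means tables created
    // under it do not inherit the (possibly local) default warehouse dir.
    // The HDFS path here is a placeholder, not the tests' actual address.
    spark.sql("CREATE DATABASE IF NOT EXISTS " + db
        + " LOCATION 'hdfs://namenode:9000/user/hive/" + db + ".db'");
    spark.sql("USE " + db);

    // Tables created from here on resolve their location from the custom
    // database, so reads and writes do not depend on the default warehouse path.
    spark.sql("CREATE TABLE IF NOT EXISTS t1 (id INT, name STRING)");
    spark.sql("INSERT INTO t1 VALUES (1, 'a')");

    spark.stop();
  }
}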
