This repository has been archived by the owner on Aug 5, 2022. It is now read-only.
forked from bazelbuild/rules_scala
sync with upstream #17
Open
dan-busch-gr wants to merge 497 commits into ConsultingMD:master from bazelbuild:master
Conversation
* Revert scala_import stamping * Revert scala_import stamping tests * Lint
* Fix the action_should_fail_with_message sh helper: add the missing `exit 1` so the test actually fails. Had to remove `jvm_opts = ["-Xbootclasspath/p:$(location @bazel_tools//third_party/java/jdk/langtools:javac_jar)"],` because the test then fails for the wrong reason: `-Xbootclasspath/p is no longer a supported option.`
* Fix test_scalac_jvm_flags_are_expanded: prefix the flag with a name, otherwise it is looked up as a class name and fails with `Caused by: java.lang.ClassNotFoundException: test_expect_failure.compilers_jvm_flags.args.txt`
* Rename the test flag to clarify intent
* scala_proto: configure ScalaPbCodeGenerator instead of the worker
* Add test. Needs to be cleaned up
* Simplify generator
* Make the custom generator as dumb as possible: ignore inputs and generate a fixed file
* Cleanups
* rc: append single element with :+
The sha256 is set for scalap:2.11.12, but the version set was 2.11.10, which caused a hash conflict.
$ sha256sum scalap-2.11.12.jar
a6dd7203ce4af9d6185023d5dba9993eb8e80584ff4b1f6dec574a2aba4cd2b7 scalap-2.11.12.jar
$ sha256sum scalap-2.11.10.jar
3f6f352fce91c33055398a7081e35e5091fd5e095130905633e72f52124c1d27 scalap-2.11.10.jar
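For reference, the `third_party/repositories` files pair each Maven coordinate with the checksum of that exact jar, so the two fields must describe the same version. A minimal sketch of a corrected entry, using the checksum quoted above; the dictionary key and surrounding layout are illustrative, not copied from the repo:

```py
# Illustrative artifact entry: the coordinate and the sha256 must refer to the
# same jar, otherwise Bazel reports a checksum mismatch when fetching it.
artifacts = {
    "io_bazel_rules_scala_scalap": {
        "artifact": "org.scala-lang:scalap:2.11.12",
        "sha256": "a6dd7203ce4af9d6185023d5dba9993eb8e80584ff4b1f6dec574a2aba4cd2b7",
    },
}
```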
* Remove proto binds
* Remove scala repository binds * Lint
* Collect flags/opts in the toolchain: do this once, where the with_ setters are defined
* Collect _extra_generator_jars in a toolchain: do it once
* Extract a function to check if a target needs to be skipped
* Move protoc to the toolchain
* Minor change
* Make it pass with descriptor sets
* Reduce providers
* Remove unused code
* Remove unused code
* Remove unused code
* Formatting
* toolchains formatting
* worker formatting
* Pass test_rules_scala.sh
* Cleanups
* Format scalapb_aspect.bzl
* Pack sources
* Replace the blacklisted_protos loop with a lookup
* Naming
* Thread through the toolchain parameter
* Move ctx.actions.run from env to jvm_flags
* Naming
* Change _host_javabase cfg from host to exec
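A rough sketch of the "collect once in the toolchain" idea from the commits above; the rule name, attributes, and provider fields here are illustrative rather than the actual rules_scala definitions:

```py
def _scala_toolchain_impl(ctx):
    # Flatten the configured options and generator jars once, at toolchain
    # analysis time, so individual scala_* targets read them from the provider
    # instead of recomputing them per target.
    return [platform_common.ToolchainInfo(
        scalacopts = list(ctx.attr.scalacopts),
        extra_generator_jars = depset(transitive = [
            dep[JavaInfo].transitive_runtime_jars
            for dep in ctx.attr.extra_generator_dependencies
        ]),
    )]

scala_toolchain_sketch = rule(
    implementation = _scala_toolchain_impl,
    attrs = {
        "scalacopts": attr.string_list(),
        "extra_generator_dependencies": attr.label_list(providers = [JavaInfo]),
    },
)
```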
* Stamp only compile jars: stamping breaks signed jars, which we keep untouched for the runtime classpath
* Add a scala_import analysis test
* Add no-ide to the analysis test
* Refactor
* Add a setting to control stamping and refactor names
* Add a stamping setting test
* Add stamping docs
* Fix doc
* Remove tests related to compile-only code
* Remove tests related to compile-only code
* Remove unused val
#1212) * Remove test for development version of bazel for the diagnostics file. * remove condition always set to false for diagnostics_file argument
…1220) * Move coverage tests under a separate job with a fixed Bazel version * Fix task names
… version 21.2.0 (#1223)
* Support scrooge-generator compiler flags in thrift_library rule (#6) * Add option parsing file * Add OptionParser * Add scopt dependency * Lint reformat * Add option parsing file * Add scopt dependency * Lint reformat * Add test for compiler_args in thrift_library * Fix lint error * Remove option parser from rules_scala code * Add scopt dependency * Update test and clean code
* Extract strict deps tests into separate shell script * Add test to showcase existing (incorrect) proto stamping behaviour * lint
* Extract common functions for testing scalafmt * Add scalafmt test with multiple Scala versions --------- Co-authored-by: mkuta <[email protected]>
* bumped artifact versions with sha sums and scala versions * reset cla * removed library version update * update sha sums * update scala version in workspace * update scala version in workspace and build * update scala version in workspaces and build * removed scalafmt updated versions * removed docs update * removed support for 3.4.2 due to failing scala3_4_example * removed support for 3.4.2 due to failing scala3_4_example * lint issue * update
…cktraces (#1606)
* Improve handling of invalid settings passed to the Scala compiler: throw a known exception instead of allowing None.get (Scala 3) or a NullPointerException (Scala 2)
* Prevent printing the ScalacInvoker stack trace on known compilation errors
* Run reporter.flush in Scala 3
* Restore previous error messages
* Add tests to ensure the correct error message and no stack traces are shown
* Remove unused code from test_invalid_scalacopts.sh
* Print the exception message only once; previously it was printed by both e.getMessage and e.printStackTrace
* Indent with spaces instead of tabs
* scala 3.4.2 support * 3.4.3 hotfix
* extra javac opts should override default ones * add test
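The override works through argument ordering: for most `javac` flags the last occurrence wins, so the per-target options just need to be appended after the toolchain defaults. A one-line sketch with illustrative names:

```py
# Later flags win, so values from target_extra_javac_opts override any
# conflicting entries in toolchain_default_javac_opts.
javac_opts = toolchain_default_javac_opts + target_extra_javac_opts
```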
Related to #1482, #1618, and #1619. Results from the investigation documented at: - #1619 (comment) Updates `_import_paths()` in `scala_proto_aspect.bzl` to handle differences in `ProtoInfo.proto_source_root` and `ProtoInfo.direct_sources` values between Bazel 6 and Bazel 7. Without this change, `_import_paths()` emits incorrect values under Bazel 7, causing targets containing generated `.proto` inputs to fail, e.g. `//test/proto3:test_generated_proto`. See also: - Fix paths for sibling repository setup and generated .proto files bazelbuild/bazel@6c6c196 - The docstring for `ProtoInfo.proto_source_root` in the Bazel sources: https://github.com/bazelbuild/bazel/blob/7.3.2/src/main/starlark/builtins_bzl/common/proto/proto_info.bzl#L155-L172 - Remove incompatible_generated_protos_in_virtual_imports bazelbuild/bazel@3efaa32 - Comment from: Generated Protos are no longer considered as virtual_imports in Bazel 7 bazelbuild/bazel#21075 (comment) --- I cherry-picked this commit into #1618. While it fixed the `//test/proto3` build failure, it does _not_ fix the hanging scalapb_workers from the ProtoScalaPBRule aspect. I'll have to investigate further whether that hang is related to Bazel, rules_proto, com_google_protobuf, or some mixture thereof. Still, progress!
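A hedged sketch of the kind of adjustment described above. The function name comes from the commit message, but the body is illustrative rather than the actual fix; it assumes `proto_source_root` either prefixes a generated file's path (Bazel 7) or doesn't apply (other layouts):

```py
def _import_paths(proto_info):
    # Derive protoc-style import paths from ProtoInfo in a way that tolerates
    # the different proto_source_root semantics of Bazel 6 and Bazel 7.
    root = proto_info.proto_source_root
    paths = []
    for src in proto_info.direct_sources:
        if root in (".", ""):
            # Sources rooted at the workspace: the path already is the import path.
            paths.append(src.path)
        elif src.path.startswith(root + "/"):
            # Bazel 7: generated .proto files sit under the source root
            # (e.g. bazel-out/.../bin/...), so strip that prefix.
            paths.append(src.path[len(root) + 1:])
        else:
            # Bazel 6 virtual imports and plain source files: fall back to the
            # repository-relative path.
            paths.append(src.short_path)
    return paths
```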
* Add toolchain_type to thrift_lib, scrooge aspects This fixes the following error: ```txt Error in create_header_compilation_action: Rule 'thrift_library' in 'rules_scala/test/src/main/scala/scalarules/test/twitter_scrooge/thrift/ thrift_with_compiler_args/BUILD:3:15' must declare '@@bazel_tools//tools/jdk:toolchain_type' toolchain in order to use java_common. See bazelbuild/bazel#18970. ``` Interestingly, adding the toolchain type to `thrift_library` isn't enough; I had to add it to the twitter_scrooge aspects as well. Until I did, it produced the same error message pointing at `thrift_library`, after first reporting: ```txt ERROR: rules_scala/test/src/main/scala/scalarules/test/twitter_scrooge/ thrift/thrift_with_compiler_args/BUILD:3:15: in //twitter_scrooge:twitter_scrooge.bzl%scrooge_java_aspect aspect on thrift_library rule //test/src/main/scala/scalarules/test/twitter_scrooge/thrift/ thrift_with_compiler_args:thrift5: Traceback (most recent call last): File "rules_scala/twitter_scrooge/twitter_scrooge.bzl", line 420, column 49, in _scrooge_java_aspect_impl return _common_scrooge_aspect_implementation(target, ctx, "java", _compile_generated_java) [ ...snip... ] ``` * Fix proto_cross_repo_boundary build_file attribute Bazel 7 enforces that this be a proper target label, not a relative path. ```txt ERROR: WORKSPACE:91:37: fetching new_local_repository rule //external:proto_cross_repo_boundary: .../external/test/proto_cross_repo_boundary/repo/BUILD.repo is not a regular file; if you're using a relative or absolute path for `build_file` in your `new_local_repository` rule, please switch to using a label instead INFO: Repository remote_jdk8_macos instantiated at: WORKSPACE:175:18: in <toplevel> .../external/rules_java/java/repositories.bzl:120:10: in remote_jdk8_repos .../external/bazel_tools/tools/build_defs/repo/utils.bzl:240:18: in maybe .../external/rules_java/toolchains/remote_java_repository.bzl:48:17: in remote_java_repository Repository rule http_archive defined at: .../external/bazel_tools/tools/build_defs/repo/http.bzl:384:31: in <toplevel> ERROR: no such package '@@proto_cross_repo_boundary//': .../external/test/proto_cross_repo_boundary/repo/BUILD.repo is not a regular file; if you're using a relative or absolute path for `build_file` in your `new_local_repository` rule, please switch to using a label instead ERROR: test/proto_cross_repo_boundary/BUILD:3:22: //test/proto_cross_repo_boundary:sample_scala_proto depends on @@proto_cross_repo_boundary//:sample_proto in repository @@proto_cross_repo_boundary which failed to fetch. no such package '@@proto_cross_repo_boundary//': .../external/test/proto_cross_repo_boundary/repo/BUILD.repo is not a regular file; if you're using a relative or absolute path for `build_file` in your `new_local_repository` rule, please switch to using a label instead Use --verbose_failures to see the command lines of failed build steps. ERROR: Analysis of target '//test/proto_cross_repo_boundary:sample_scala_proto' failed; build aborted: Analysis failed ``` * Add rules_python stanza to WORKSPACE files Fixes the following breakages under Bazel 7: ```txt $ bazel build //src/... 
ERROR: error loading package '@@bazel_tools//src/main/protobuf': cannot load '@@rules_python//python:proto.bzl': no such file ERROR: .../src/java/io/bazel/rulesscala/coverage/instrumenter/BUILD:3:12: error loading package '@@bazel_tools//src/main/protobuf': cannot load '@@rules_python//python:proto.bzl': no such file and referenced by '//src/java/io/bazel/rulesscala/coverage/instrumenter:instrumenter' ``` * Set --noenable_bzlmod to prevent MODULE.bazel Makes sure Bazel doesn't autogenerate MODULE.bazel and MODULE.bazel.lock files for now. * Update to Bazel 6.5.0 This is the last release in the Bazel 6 line. With the changes up to this point, the Bazel 7 build only fails on protobuf generation, e.g. on `bazel build //test/proto3:all`, as seen below. Bumping to Bazel 6.5.0 on top of all the previous changes isolates the Bazel 7.0.0 changes causing the failure. ```txt $ bazel build //test/proto3:all ERROR: .../rules_scala/test/proto3/BUILD:14:14: ProtoScalaPBRule test/proto3/generated-proto-lib_jvm_extra_protobuf_generator_scalapb.srcjar failed: (Exit 1): scalapb_worker failed: error executing ProtoScalaPBRule command (from target //test/proto3:generated-proto-lib) bazel-out/darwin_arm64-opt-exec-ST-13d3ddad9198/bin/src/scala/scripts/scalapb_worker ... (remaining 2 arguments skipped) Could not find file in descriptor database: bazel-out/darwin_arm64-fastbuild/bin/test/proto3/generated.proto: No such file or directory java.lang.RuntimeException: Exit with code 1 at scala.sys.package$.error(package.scala:30) at scripts.ScalaPBWorker$.work(ScalaPBWorker.scala:44) at io.bazel.rulesscala.worker.Worker.persistentWorkerMain(Worker.java:96) at io.bazel.rulesscala.worker.Worker.workerMain(Worker.java:49) at scripts.ScalaPBWorker$.main(ScalaPBWorker.scala:39) at scripts.ScalaPBWorker.main(ScalaPBWorker.scala) Use --verbose_failures to see the command lines of failed build steps. ERROR: .../rules_scala/test/proto3/BUILD:14:14 scala @//test/proto3:generated-proto-lib failed: (Exit 1): scalapb_worker failed: error executing ProtoScalaPBRule command (from target //test/proto3:generated-proto-lib) bazel-out/darwin_arm64-opt-exec-ST-13d3ddad9198/bin/src/scala/scripts/scalapb_worker ... (remaining 2 arguments skipped) ``` * Fix WORKSPACE lint errors One of these days I'll remember to run `bazel run //tools:lint_fix` before opening a pull request. * Set --enable_workspace on last_green CI build Learning more about BuildKite every day. * Allow test_all.sh to pass under Bazel 7 `bash ./test_all.sh` passes after minor updates to the following test cases to handle slightly different behavior under Bazel 7: - `test_custom_reporter_class_has_been_set` makes the test output dir writable, since it no longer is by default. - `test_scala_import_expect_failure_on_missing_direct_deps_warn_mode` removes preexisting output, since Bazel 7 won't emit warnings if it already exists. 
--- FYI: Under Bazel 7.0 and Bazel 7.1, buildifier warnings for external targets in the following test cases changed to begin with `@@repo_name` instead of `@repo_name`: - test/shell/test_scala_library.sh: `test_scala_library_expect_failure_on_missing_direct_external_deps_jar` `test_scala_library_expect_failure_on_missing_direct_external_deps_file_group` - test/shell/test_strict_dependency.sh: `test_stamped_target_label_loading` `test_strict_deps_filter_included_target` - test/shell/test_test_unused_dependency.sh: `test_unused_deps_filter_included_target` However, as of Bazel 7.2, those warnings changed back to `@repo_name`, so those changes aren't reflected in this commit. * Bump to rules_go 0.39.1 for Bazel 7 compatibility `bazel run //tools:lint_{check,fix}` required updating rules_go to 0.39.1 to work under Bazel 7. The previous rules_go version 0.38.1 caused the following error: ```txt $ ./test_lint.sh ERROR: .../external/io_bazel_rules_go/BUILD.bazel:71:16: in (an implicit dependency) attribute of go_context_data rule @@io_bazel_rules_go//:go_context_data: in $whitelist_function_transition attribute of go_context_data rule @@io_bazel_rules_go//:go_context_data: package group '@@bazel_tools//tools/whitelists/function_transition_whitelist:function_transition_whitelist' is misplaced here (they are only allowed in the visibility attribute) ERROR: .../external/io_bazel_rules_go/BUILD.bazel:71:16: Analysis of target '@@io_bazel_rules_go//:go_context_data' failed ERROR: Analysis of target '//tools:lint_check' failed; build aborted: Analysis failed ``` * Bump to rules_java 7.9.0 for Bazel 7 compatibility Updated rules_java to 7.9.0 to fix errors of the following type under Bazel 7.2.1: ```txt ERROR: test/BUILD:640:10: While resolving toolchains for target //test:jar_lister (096dcc8): invalid registered toolchain '@remote_jdk8_linux_aarch64_toolchain_config_repo//:bootstrap_runtime_toolchain': no such target '@@remote_jdk8_linux_aarch64_toolchain_config_repo//:bootstrap_runtime_toolchain': target 'bootstrap_runtime_toolchain' not declared in package '' defined by .../external/remote_jdk8_linux_aarch64_toolchain_config_repo/BUILD.bazel ERROR: Analysis of target '//test:jar_lister' failed; build aborted ``` Also removed the seemingly unused `rules_java_extra` stanza from `WORKSPACE`. --- Though `rules_java` version 7.12.1 is available, and largely works with this repo when using Bazel 7.3.2, it requires a few temporary workarounds. (As I write this, 8.0.0 came out just a few hours ago and I haven't tried it.) Rather than commit the workarounds, upgrading only to 7.9.0 now seems less crufty. Though this commit only updates Bazel to 7.2.1, I suspect it's probably the same basic problem at play. What follows is a very detailed explanation of what happens with 7.12.1 with Bazel 7.3.2, just to have it on the record. --- The workaround is to change a few toolchain and macro file targets from `@bazel_tools//tools/jdk:` to `@rules_java//toolchains:`. This isn't a terribly bad or invasive workaround, but `@bazel_tools//tools/jdk:` is clearly the canonical path. Best to keep it that way, lest we build up technical debt. 
Without the workaround, these targets would fail: - //test/src/main/resources/java_sources:CompiledWithJava11 - //test/src/main/resources/java_sources:CompiledWithJava8 - //test/toolchains:java21_toolchain - //test:JunitRuntimePlatform - //test:JunitRuntimePlatform_test_runner - //test:scala_binary_jdk_11 with this error: ```txt ERROR: .../external/rules_java_builtin/toolchains/BUILD:254:14: While resolving toolchains for target @@rules_java_builtin//toolchains:platformclasspath (096dcc8): No matching toolchains found for types @@bazel_tools//tools/jdk:bootstrap_runtime_toolchain_type. ``` This appears to be a consequence of both upgrading the Bazel version to 7.3.2 and updating `rules_java` to 7.12.1. The `rules_java_builtin` repo is part of the `WORKSPACE` prefix that adds implicit dependencies: - https://bazel.build/external/migration#builtin-default-deps This repo was added to 7.0.0-pre.20231011.2 in the following change, mapped to `@rules_java` within the scope of the `@bazel_tools` repo: - bazelbuild/bazel: Add rules_java_builtin to the users of Java modules bazelbuild/bazel@ff1abb2 This change tried to ensure `rules_java` remained compatible with earlier Bazel versions. However, it changed all instances of `@bazel_tools//tools/jdk:bootstrap_runtime_toolchain_type` to `//toolchains:bootstrap_runtime_toolchain_type`: - bazelbuild/rules_java: Make rules_java backwards compatible with Bazel 6.3.0 bazelbuild/rules_java@30ecf3f Bazel has bumped `rules_java` in its `workspace_deps.bzl` from 7.9.0 to 7.11.0, but it's only available as of 8.0.0-pre.20240911.1. - bazelbuild/bazel: Update rules_java 7.11.1 / java_tools 13.8 bazelbuild/bazel#23571 bazelbuild/bazel@f92124a --- What I believe is happening is, under Bazel 7.3.2 and `rules_java` 7.12.1: - Bazel creates `rules_java` 7.9.0 as `@rules_java_builtin` in the `WORKSPACE` prefix. - `@bazel_tools` has `@rules_java` mapped to `@rules_java_builtin` when initialized during the `WORKSPACE` prefix, during which `@bazel_tools//tools/jdk` registers `alias()` targets to `@rules_java` toolchain targets. These aliased toolchains specify `@bazel_tools//tools/jdk:bootstrap_runtime_toolchain_type` in their `toolchains` attribute. - `WORKSPACE` loads `@rules_java` 7.12.1 and registers all its toolchains with type `@rules_java//toolchains:bootstrap_runtime_toolchain_type`. - Some `@rules_java` rules explicitly specifying toolchains from `@bazel_tools//tools/jdk` can't find them, because the `@bazel_tools//tools/jdk` toolchain aliases expect toolchains of type `@bazel_tools//tools/jdk:bootstrap_runtime_toolchain_type`. This has broken other projects in the same way: - bazelbuild/bazel: [Bazel CI] Downstream project broken by rules_java upgrade #23619 bazelbuild/bazel#23619 These problems don't appear under Bzlmod, whereby `@rules_java_builtin` was never required. This is because `WORKSPACE` executes its statements sequentially, while Bzlmod builds the module dependency graph _before_ instantiating repositories (within module extensions). It seems a fix is on the way that removes `@rules_java_builtin` from the `WORKSPACE` prefix, and adds `@rules_java` to the suffix. At this moment, though, it's not even in a prerelease: - bazelbuild/bazel: Remove rules_java_builtin in WORKSPACE prefix bazelbuild/bazel@7506690 --- Note that the error message matches that from the following resolved issue, but that issue was for non-Bzlmod child repos when `WORKSPACE` was disabled. 
- bazelbuild/bazel: Undefined @@rules_java_builtin repository with --noenable_workspace option bazelbuild/bazel#22754
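A minimal sketch of the shape of the toolchain_type fix described at the top of this commit group: both the rule and the aspects whose implementations reach `java_common` declare the JDK toolchain type. The implementation bodies below are placeholders, not the real twitter_scrooge code.

```py
_JDK_TOOLCHAIN_TYPE = "@bazel_tools//tools/jdk:toolchain_type"

def _thrift_library_impl(ctx):
    # Placeholder body for the sketch.
    return []

def _scrooge_java_aspect_impl(target, ctx):
    # Placeholder body for the sketch.
    return []

# Under Bazel 7, any rule or aspect that calls java_common must declare the
# JDK toolchain type explicitly.
thrift_library = rule(
    implementation = _thrift_library_impl,
    attrs = {"srcs": attr.label_list(allow_files = [".thrift"])},
    toolchains = [_JDK_TOOLCHAIN_TYPE],
)

scrooge_java_aspect = aspect(
    implementation = _scrooge_java_aspect_impl,
    attr_aspects = ["deps"],
    toolchains = [_JDK_TOOLCHAIN_TYPE],
)
```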
* scala 3.4.2 support * scala 3.5.0 support * removed unnecessary artifact for scala versions < 3.4.0 * added artifacts for scala versions >= 3.4.0 * 3.4.3 hotfix
Reverts #1349. This check should be optional; however, it fails all builds because `bazel info` fails. The error is related to workspace setup. Since we are in the process of moving to Bzlmod, I think we should not bother with that now, especially since reproducing locally gives a different error.
Part of #1482. Bumps every Scala version up to use protobuf-java:4.28.2, to preemptively avoid the following error produced by upcoming Scalafmt updates: ```txt $ bazel test --repo_env=SCALA_VERSION=2.11.12 //test/scalafmt/... INFO: Analyzed 9 targets (80 packages loaded, 3307 targets configured). ERROR: .../test/scalafmt/BUILD:43:20: ScalaFmt test/scalafmt/test/scalafmt/formatted/formatted-test.scala.fmt.output failed: Worker process did not return a WorkResponse: ---8<---8<--- Start of log, file at .../bazel-workers/worker-134-ScalaFmt.log ---8<---8<--- Exception in thread "main" java.lang.NoSuchMethodError: 'boolean com.google.protobuf.GeneratedMessageV3.isStringEmpty(java.lang.Object)' [ ...snip... ] ``` This issue seemed to suggest a library bump may fix it, and it did: - protocolbuffers/protobuf#9236 Fortunately, even though this is a major version bump from 3.10.0 to 4.28.2, there were no compatibility issues, per: > Protobuf major version releases may also be backwards-compatible with > the last release of the previous major version. See the release notice > for more details. > > - https://github.com/protocolbuffers/protobuf/tree/main/java#compatibility-notice
* Update test scripts for Bzlmod compatibility Part of #1482. These are mostly small changes to make test assertions more flexible between `WORKSPACE` and Bzlmod runs. For Bzlmod runs: - Fixed `test_scala_config_content` from `test_scala_config.sh` by changing a path from `external/io_bazel_rules_scala_config` to `external/*io_bazel_rules_scala_config`. - Fixed a number of tests by updating expected output messages to allow them to start with either `@//` or `@@//`. - Fixed `test_stamped_target_label_loading` from `test/shell/test_strict_dependency.sh` by accommodating the canonical `io_bazel_rules_scala_guava` repo name. Also allows for the optional current Scala version suffix. Also made these other important changes: - Updated all the assertions in `test_helper.sh` to use Bash builtin regex matching via `_expect_failure_with_messages` instead of `grep`. This allows the expected message patterns to use full regular expressions while avoiding forking a new process. This new function helped reduce duplication in that file at the same time. - Added `--repo_env="SCALA_VERSION=..."` to each test script called from `./test_coverage.sh`, and set `SCALA_VERSION` to 2.12.19 in each of these files. Using other Scala versions technically works, but the output is slightly different, causing the `diff` commands in the test cases to fail. - Updated `test_version.sh` to copy the top level `.bazelversion` file into its test repo. - Changed how `test_version.sh` handles injecting `twitter_scrooge` repos into the `WORKSPACE` file. This will make it easy to do the equivalent for `MODULE.bazel` when the time comes. Also includes a few minor, opportunistic formatting cleanups. * Add SCALA_VERSION comment to test_coverage_*.sh After @simuons asked about an apparently missing `--repo_env` setting in `test_coverage_scalatest.sh` in #1622, I realized the test case missing it used `grep`, not `diff`. Rather than add the flag where it wasn't needed, I commented the local `SCALA_VERSION` setting in each file. Now that I think about it, the proper solution is to generate coverage files for each Scala version, or for ranges of scala versions. Then we can remove this `SCALA_VERSION` and `--repo_env` setting completely. I'll look into that as a side quest.
* Bump to rules_proto 6.0.2, separate protobuf 21.7 This is in preparation for adding a `MODULE.bazel` file, which will bring in dependencies that use rules_proto 6.x. This change avoids that warning, and changing it now ensures that `WORKSPACE` builds are still compatible. As explained in the 6.0.0 release notes, rules_proto 6.x no longer depends directly on com_google_protobuf, so we need to import it separately. We're keeping the Protobuf version as 21.7 for now, because Protobuf 22.x requires C++14 at a minimum, which Bazel 6.x doesn't support by default: - https://protobuf.dev/news/v22/#c11-support - https://github.com/protocolbuffers/protobuf/releases/tag/v22.0 The downside is that, unlike rules_proto-5.3.0-21.7, we now end up building (and occasionally rebuilding) `protoc` as part of our build. However, this also enables clients to try "Protobuf Toolchainization" to download precompiled `protoc` binaries. - https://github.com/bazelbuild/rules_proto/releases/tag/6.0.0 - https://docs.google.com/document/d/1CE6wJHNfKbUPBr7-mmk_0Yo3a4TaqcTPE0OWNuQkhPs/edit For now, Protobuf Toolchainization requires Bazel 6.5.0 or Bazel >= 7 and the `--incompatible_enable_proto_toolchain_resolution` flag: - https://bazel.build/reference/command-line-reference#flag--incompatible_enable_proto_toolchain_resolution I've got a working draft implementation in a separate branch, though that experiment is certainly not urgent. * Update check for existing com_google_protobuf I'd previously checked for an existing `protobuf` rule, not `com_google_protobuf`. D'oh!
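Roughly, the `WORKSPACE` shape this describes looks like the sketch below. The URLs point at GitHub's auto-generated tag archives and the sha256 attributes are omitted rather than guessed, so treat this as an illustration of the split, not the exact archives the repo pins:

```py
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

# rules_proto 6.x no longer pulls in protobuf itself...
http_archive(
    name = "rules_proto",
    strip_prefix = "rules_proto-6.0.2",
    urls = ["https://github.com/bazelbuild/rules_proto/archive/refs/tags/6.0.2.tar.gz"],
)

# ...so com_google_protobuf, held at 21.7 for Bazel 6.x compatibility, is now
# imported as its own repository.
http_archive(
    name = "com_google_protobuf",
    strip_prefix = "protobuf-21.7",
    urls = ["https://github.com/protocolbuffers/protobuf/archive/refs/tags/v21.7.tar.gz"],
)
```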
* Use custom repo rule to generate @scalafmt_default Part of #1482. Replaces `native.new_local_repository` in `scalafmt_default_config` with the new `scalafmt_config` repo rule. Resolves an incompatibility between Bazel 6.5.0 and 7.3.2, while creating a much smaller `@scalafmt_default` repo. Also updates the `native.register_toolchains` call with a stringified `Label` to remove the hardcoding of `@io_bazel_rules_scala`. The problem is that under Bazel 7.3.2, calling `scalafmt_default_config` from a module extension produces this error: ```txt $ bazel test //test/scalafmt/... ERROR: Traceback (most recent call last): File ".../scala/extensions/deps.bzl", line 218, column 32, in _scala_deps_impl scalafmt_default_config() File ".../scala/scalafmt/scalafmt_repositories.bzl", line 18, column 32, in scalafmt_default_config native.new_local_repository( Error in new_local_repository: The native module can be accessed only from a BUILD thread. Wrap the function in a macro and call it from a BUILD file ``` Ostensibly the solution is to replace `native.new_local_repository` with: ```py load( "@bazel_tools//tools/build_defs/repo:local.bzl", "new_local_repository", ) ``` However, `local.bzl` isn't available in Bazel 6.5.0: ```txt $ bazel test //test/scalafmt/... ERROR: Error computing the main repository mapping: at .../scala/scalafmt/scalafmt_repositories.bzl:8:6: cannot load '@bazel_tools//tools/build_defs/repo:local.bzl': no such file ``` The new `scalafmt_config` repository rule works under both Bazel 6.5.0 and 7.3.2. Also, it symlinks only the single Scalafmt config file, instead of every top level file and directory in the project, as `new_local_repository` did. * Fix scalafmt_repositories.bzl lint errors Broke CI again. D'oh!
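A minimal sketch of a repository rule along these lines; the rule name comes from the commit, but the body is illustrative and assumes the config file is exposed through a tiny generated BUILD file:

```py
def _scalafmt_config_impl(repository_ctx):
    # Symlink only the single Scalafmt config file into the new repository,
    # instead of every top-level file and directory of the project.
    repository_ctx.symlink(repository_ctx.attr.config, ".scalafmt.conf")
    repository_ctx.file(
        "BUILD",
        'exports_files([".scalafmt.conf"])\n',
        executable = False,
    )

scalafmt_config = repository_rule(
    implementation = _scalafmt_config_impl,
    attrs = {
        "config": attr.label(allow_single_file = True, mandatory = True),
    },
)
```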
Part of #1482. Avoids a future Scala 3.5 compatibility issue under Bzlmod. PR #1604 added Scala 3.5 support. I updated my Bzlmod working branch to include those changes, and updated `scala_3_5.bzl` with dependencies I'd added or updated from `scala_3_4.bzl`. `./test_examples.sh` then failed with the following error, which I recreated in the local `examples/scala3` repo: ```txt $ bazel build --repo_env=SCALA_VERSION=3.5.0 //... ERROR: .../external/rules_scala~/third_party/dependency_analyzer/src/main/BUILD:4:39: scala @@rules_scala~//third_party/dependency_analyzer/src/main:dependency_analyzer failed: (Exit 1): scalac_bootstrap failed: error executing Scalac command (from target @@rules_scala~//third_party/dependency_analyzer/src/main:dependency_analyzer) bazel-out/.../bin/external/rules_scala~/src/java/io/bazel/rulesscala/scalac/scalac_bootstrap ... (remaining 1 argument skipped) -- [E164] Declaration Error: external/rules_scala~/third_party/dependency_analyzer/src/main/io/bazel/rulesscala/dependencyanalyzer3/DependencyAnalyzer.scala:21:6 21 | def init(options: List[String]): List[PluginPhase] = | ^ |error overriding method init in trait StandardPlugin of type (options: List[String]): List[dotty.tools.dotc.plugins.PluginPhase]; | method init of type (options: List[String]): List[dotty.tools.dotc.plugins.PluginPhase] needs `override` modifier ``` This may be because I bumped `io_bazel_rules_scala_scala_library_2` from `org.scala-lang:scala-library:2.13.12` to 2.13.14 in my Bzlmod branch. That's my guess based on information about the `dotty` packages from the following files: - `third_party/utils/src/test/io/bazel/rulesscala/utils/Scala3CompilerUtils.scala` - `third_party/dependency_analyzer/src/test/analyzer_test_scala_3.bzl` It's interesting that Scala 3.{1,2,3,4} all passed with that `scala-library` version bump, but without this change, but I can't explain why.
* Add ProtoReporter and DepsTrackingReporter for Scala 3. Collect diagnostics in Scala 3 * Lint + rename ConsoleReporter -> BazelConsoleReporter to disambiguate it from dotc.reporting.ConsoleReporter * Revert spuriously commented out tests in `test_version.sh`
* Add repo name macros for Bzlmod compatibility Part of #1482. These helper macros fix various repository name related errors when building under Bzlmod, while remaining backwards compatible with `WORKSPACE`. Without Bzlmod, these macros return original repo names. With Bzlmod enabled, they avoid the problems described below. I've prepared a change for bazelbuild/bazel-skylib containing these macros with full unit tests. If the maintainers accept that change, we can bump our bazel_skylib version to use the macros from there, and remove the `bzlmod.bzl` file. --- Also includes a couple of other minor touch-ups: - Updated the `runtime_deps` attribute in `repositories()` to add the Scala version suffix, just like `deps`. - Added a `fail()` message to `repositories()` to make it more clear which Scala version dictionary is missing an artifact. - Removed unnecessary internal uses of the `@io_bazel_rules_scala` repo name, applying `Label()` where necessary. - Updated the construction of `dep_providers` in `_default_dep_providers` to remove the repo name, reduce duplication, and make the upcoming toolchain update slightly cleaner. --- Before this change, `repositories()` would originally emit `BUILD` targets including canonical repo names: ```py scala_import( name = "_main~scala_deps~io_bazel_rules_scala_scala_compiler", jars = ["scala-compiler-2.12.18.jar"], ) ``` resulting in errors like: ```txt ERROR: .../_main~_repo_rules~io_bazel_rules_scala/scala/BUILD: no such target '@@_main~scala_deps~io_bazel_rules_scala_scala_compiler//:io_bazel_rules_scala_scala_compiler': target 'io_bazel_rules_scala_scala_compiler' not declared in package '' defined by .../_main~scala_deps~io_bazel_rules_scala_scala_compiler/BUILD and referenced by '@@_main~_repo_rules~io_bazel_rules_scala//scala:default_toolchain_scala_compile_classpath_provider' ``` --- Attaching resources from custom repos to targets under Bzlmod, like in `scalarules.test.resources.ScalaLibResourcesFromExternalDepTest`, would break with: ```txt $ bazel test //test/src/main/scala/scalarules/test/resources:all 1) Scala library depending on resources from external resource-only jar::allow to load resources(scalarules.test.resources.ScalaLibResourcesFromExternalDepTest) java.lang.NullPointerException at scalarules.test.resources.ScalaLibResourcesFromExternalDepTest.get(ScalaLibResourcesFromExternalDepTest.scala:17) at scalarules.test.resources.ScalaLibResourcesFromExternalDepTest.$anonfun$new$3(ScalaLibResourcesFromExternalDepTest.scala:11) at scalarules.test.resources.ScalaLibResourcesFromExternalDepTest.$anonfun$new$2(ScalaLibResourcesFromExternalDepTest.scala:11) ``` `_update_external_target_path` in `resources.bzl` fixes this problem. --- Fixes `test_strict_deps_filter_included_target` from `test/shell/test_strict_dependency.sh` when run under Bzlmod. The `dependency_tracking_strict_deps_patterns` attribute of //test_expect_failure/missing_direct_deps/filtering:plus_one_strict_deps_filter_a_impl contains patterns starting with `@//`. However, in `_phase_dependency()` from `scala/private/phases/phase_dependency.bzl`, these patterns were compared against a stringified Label. Under Bazel < 7.1.0, this works for root target Labels. Under Bazel >= 7.1.0, this breaks for root target Labels under Bzlmod, which start with `@@//`. `adjust_main_repo_prefix` updates the patterns accordingly in `_partition_patterns` from `scala_toolchain.bzl`. 
`apparent_repo_label_string` makes `_phase_dependency()` more resilient when comparing target Labels against filters containing external apparent repo names. --- Fixes the `alias` targets generated by `_jvm_import_external` from `scala_maven_import_external.bzl` by setting the `target` to the correct apparent repo name. Added `apparent_repo_name(repository_ctx.name)` to `_jvm_import_external` to avoid this familiar error when running `dt_patches/test_dt_patches` tests: ```txt $ bazel build //... ERROR: .../external/_main~compiler_source_repos~scala_reflect/BUILD: no such target '@@_main~compiler_source_repos~scala_reflect//:scala_reflect': target 'scala_reflect' not declared in package '' defined by .../external/_main~compiler_source_repos~scala_reflect/BUILD ERROR: .../dt_patches/test_dt_patches/BUILD:11:22: no such target '@@_main~compiler_source_repos~scala_reflect//:scala_reflect': target 'scala_reflect' not declared in package '' defined by .../external/_main~compiler_source_repos~scala_reflect/BUILD and referenced by '//:dt_scala_toolchain_scala_compile_classpath_provider' ERROR: Analysis of target '//:dt_scala_toolchain_scala_compile_classpath_provider' failed; build aborted: Analysis failed ``` --- As for why we need these macros, we can't rely on hacking the specific canonical repository name format: > Repos generated by extensions have canonical names in the form of > `module_repo_canonical_name+extension_name+repo_name`. Note that the > canonical name format is not an API you should depend on — it's > subject to change at any time. > > - https://bazel.build/external/extension#repository_names_and_visibility The change to no longer encode module versions in canonical repo names in Bazel 7.1.0 is a recent example of Bazel maintainers altering the format: - bazelbuild/bazel#21316 And the maintainers recently replaced the `~` delimiter with `+` in the upcoming Bazel 8 release due to build performance issues on Windows: - bazelbuild/bazel#22865 The core `apparent_repo_name` function assumes the only valid repo name characters are letters, numbers, '_', '-', and '.'. This is valid so long as this condition holds: - https://github.com/bazelbuild/bazel/blob/7.3.2/src/main/java/com/google/devtools/build/lib/cmdline/RepositoryName.java#L159-L162 * Bazel 5 updates from bazelbuild/bazel-skylib#548 I don't think we really need them, as I don't think we support Bazel 5. But better to have and not need, I guess. * Fix _MAIN_REPO_PREFIX in bzlmod.bzl Backported from bazelbuild/bazel-skylib#548, whereby I realized `Label(//:all)` would not produce the main repo prefix when imported into other repos. * Expand dependency tracking patterns using Label Replaced `apparent_repo_label_string` and `adjust_main_repo_prefix` based on an idea from @fmeum during his review of bazelbuild/bazel-skylib#548. Added `_expand_patterns`, which uses `native.package_relative_label` to expand the `dependency_tracking_*_deps_patterns` attributes to full, correct `Label` strings. All `test/shell/test_{strict,unused}_dependency.sh` test cases pass. * Fix `scala_toolchains` lint error * Use `apparent_repo_name` only on `repository_ctx` Repurposed `apparent_repo_name` to only work for `repository_ctx` objects, not repository or module names in general. Removed `generated_rule_name` from `repositories.bzl`, since it's no longer necessary. Technically we could eliminate `apparent_repo_name` by making `generated_rule_name` a mandatory attribute of `_jvm_import_external`. 
However, this feels ultimately clunky and unnecessary. This update to `apparent_repo_name` required removing `_update_external_target_path` and updating `_target_path_by_default_prefixes` to remove `external/<canonical_repo_name>` prefixes. This represents a breaking change for files referencing `external/<repo_name>` paths, but the quick fix is to delete that prefix in the code. This matches the behavior in the same function regarding `resources/` and `java/` prefixes. * Update `_jvm_import_external` JAR alias target Changes the target for the `//jar:jar` alias in `_jvm_import_scala` to the top level repo target directly (`//:%s`) instead of the repo name (`@%s`). Functionally the same, but seems a bit cleaner than referencing the target as though it were external to the repo. * Fix typo in apparent_repo_name comment Caught by @simuons in #1621, thankfully. I hate sloppy comments, and hate it more when I write them!
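Based on the constraints spelled out above (the apparent name is the trailing run of letters, digits, '_', '-', and '.' in the canonical name), here is a hedged sketch of what such a helper might look like; it is not the exact `bzlmod.bzl` implementation:

```py
def apparent_repo_name(repository_ctx):
    """Sketch: recover an apparent repo name from a canonical one.

    Walks backwards from the end of the canonical name and drops everything up
    to and including the last character that can't appear in a repo name, such
    as the `~` or `+` canonical-name delimiter. Without Bzlmod there is no such
    character, so the name is returned unchanged.
    """
    name = repository_ctx.name
    for i in range(len(name) - 1, -1, -1):
        c = name[i]
        if not (c.isalnum() or c in "_-."):
            return name[i + 1:]
    return name
```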
Added a `catch` case for `Throwable` in `handleCodeGeneratorRequest`. Part of #1482. Not catching this caused `ProtoScalaPBRule` workers to hang when building `//test/...` after bumping com_google_protobuf from 25.5 to 27.5. With this change, we can now see the error: ```txt $ USE_BAZEL_VERSION=7.3.2 bazel build //test/... ERROR: .../test/proto/BUILD:165:14: ProtoScalaPBRule test/proto/standalone_proto_strip_import_prefix_package_jvm_extra_protobuf_generator_scalapb.srcjar failed: (Exit 1): scalapb_worker failed: error executing ProtoScalaPBRule command (from target //test/proto:standalone_proto_strip_import_prefix_package) bazel-bin/src/scala/scripts/scalapb_worker ... (remaining 2 arguments skipped) --jvm_extra_protobuf_generator_out: java.lang.NoSuchMethodError: 'java.lang.Object com.google.protobuf.DescriptorProtos$MessageOptions.getExtension(com.google.protobuf.GeneratedMessage$GeneratedExtension)' at scalapb.compiler.DescriptorImplicits$ExtendedMessageDescriptor.messageOptions(DescriptorImplicits.scala:532) at scalapb.compiler.DescriptorImplicits$ExtendedMessageDescriptor.sealedOneOfExtendsCount(DescriptorImplicits.scala:578) at scalapb.compiler.ProtoValidation.validateMessage(ProtoValidation.scala:107) at scalapb.compiler.ProtoValidation.$anonfun$validateFile$2(ProtoValidation.scala:17) at scalapb.compiler.ProtoValidation.$anonfun$validateFile$2$adapted(ProtoValidation.scala:17) at scala.collection.Iterator.foreach(Iterator.scala:943) at scala.collection.Iterator.foreach$(Iterator.scala:943) at scala.collection.AbstractIterator.foreach(Iterator.scala:1431) at scala.collection.IterableLike.foreach(IterableLike.scala:74) at scala.collection.IterableLike.foreach$(IterableLike.scala:73) at scala.collection.AbstractIterable.foreach(Iterable.scala:56) at scalapb.compiler.ProtoValidation.validateFile(ProtoValidation.scala:17) at scalapb.compiler.ProtoValidation.$anonfun$validateFiles$1(ProtoValidation.scala:10) at scalapb.compiler.ProtoValidation.$anonfun$validateFiles$1$adapted(ProtoValidation.scala:10) at scala.collection.immutable.Stream.foreach(Stream.scala:533) at scalapb.compiler.ProtoValidation.validateFiles(ProtoValidation.scala:10) at scalarules.test.extra_protobuf_generator.ExtraProtobufGenerator$.handleCodeGeneratorRequest(ExtraProtobufGenerator.scala:65) at scalarules.test.extra_protobuf_generator.ExtraProtobufGenerator$.run(ExtraProtobufGenerator.scala:47) at protocbridge.frontend.PluginFrontend$.$anonfun$runWithBytes$1(PluginFrontend.scala:51) at scala.util.Try$.apply(Try.scala:213) at protocbridge.frontend.PluginFrontend$.runWithBytes(PluginFrontend.scala:51) at protocbridge.frontend.PluginFrontend$.runWithInputStream(PluginFrontend.scala:121) at protocbridge.frontend.PosixPluginFrontend$.$anonfun$prepare$2(PosixPluginFrontend.scala:40) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) at scala.concurrent.impl.ExecutionContextImpl$DefaultThreadFactory$$anon$1$$anon$2.block(ExecutionContextImpl.scala:75) at java.base/java.util.concurrent.ForkJoinPool.managedBlock(ForkJoinPool.java:3118) at scala.concurrent.impl.ExecutionContextImpl$DefaultThreadFactory$$anon$1.blockOn(ExecutionContextImpl.scala:87) at scala.concurrent.package$.blocking(package.scala:146) at protocbridge.frontend.PosixPluginFrontend$.$anonfun$prepare$1(PosixPluginFrontend.scala:38) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659) at scala.util.Success.$anonfun$map$1(Try.scala:255) at 
scala.util.Success.map(Try.scala:213) at scala.concurrent.Future.$anonfun$map$1(Future.scala:292) at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:42) at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:74) at java.base/java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1426) at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290) at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020) at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656) at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594) at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183) java.lang.RuntimeException: Exit with code 1 at scala.sys.package$.error(package.scala:30) at scripts.ScalaPBWorker$.work(ScalaPBWorker.scala:44) at io.bazel.rulesscala.worker.Worker.persistentWorkerMain(Worker.java:96) at io.bazel.rulesscala.worker.Worker.workerMain(Worker.java:49) at scripts.ScalaPBWorker$.main(ScalaPBWorker.scala:39) at scripts.ScalaPBWorker.main(ScalaPBWorker.scala) ERROR: .../test/proto/BUILD:165:14 scala @//test/proto:standalone_proto_strip_import_prefix_package failed: (Exit 1): scalapb_worker failed: error executing ProtoScalaPBRule command (from target //test/proto:standalone_proto_strip_import_prefix_package) bazel-out/darwin_arm64-opt-exec-ST-a828a81199fe/bin/src/scala/scripts/scalapb_worker ... (remaining 2 arguments skipped) ``` The fix for this is to bump ScalaPB jar versions to 1.0.0-alpha.1, which will come in a future change.
* Add tools/sync-bazelversion.sh, commit results Ensures Bazelisk uses the same Bazel version in every workspace in the project. Part of #1482. I discovered that the workspaces under the main workspace were running the latest mainline Blaze release I had installed, instead of 6.5.0 from `.bazelversion`. Now all the workspaces are synchronized on the same Bazel version. As noted in the `tools/sync-bazelversion.sh` comments, symlinks or an `import ../.bazelversion` mechanism would be preferable. However, the former can surprise Windows users, and Bazelisk doesn't support the latter. * Move sync-bazelversion.sh from tools/ to scripts/ Requested by @simuons in #1629.
* Add scala.tools.cmd.CommandLineParser for >=2.13.9 Fixes //third_party/utils/src/test:test_util for Scala 2.13.9 and later by providing a thin `scala.tools.cmd.CommandLineParser` adapter. Before this change, building under Scala 2.13.14 produced: ```txt $ bazel build --repo_env=SCALA_VERSION=2.13.14 \ //third_party/utils/src/test:test_util ERROR: .../third_party/utils/src/test/BUILD:6:14: scala @@//third_party/utils/src/test:test_util failed: (Exit 1): scalac failed: error executing Scalac command (from target //third_party/utils/src/test:test_util) bazel-out/darwin_arm64-opt-exec-ST-d57f47055a04/bin/src/java/io/bazel/rulesscala/scalac/scalac @bazel-out/darwin_arm64-fastbuild/bin/third_party/utils/src/test/test_util.jar-0.params third_party/utils/src/test/io/bazel/rulesscala/utils/TestUtil.scala:10: error: object cmd is not a member of package tools import scala.tools.cmd.CommandLineParser ^ third_party/utils/src/test/io/bazel/rulesscala/utils/TestUtil.scala:119: error: not found: value CommandLineParser val options = CommandLineParser.tokenize(compileOptions) ^ ``` Turns out `CommandLineParser` disappeared in Scala 2.13.9, and its `Parser` replacement is private: - scala/scala#10057 * Use local proxy instead of scala.tools.cmd patch Copied the `CompilerAPICompat` code verbatim from @WojciechMazur's suggestion on #1634. Definitely seems more robust than injecting `CommandLineParser` directly into `scala.tools.cmd`. * Convert CompilerAPICompat trait to adapter object Requested by @simuons in #1634. Renamed it as well to reflect its very specific function.
* Update latest Scala 2 and Scala 3 versions * Update changed versions used in tests * Fix failing Scala 3 diagnostics tests after upgrade to 3.3.4 - this version backported propositions for missing identifiers
* Create the `@scala_compiler_sources` repo Contains aliases to versioned Scala compiler source repository targets. Part of #1482. Updates the version specific repo references in the srcs attribute of `//third_party/dependency_analyzer/src/main/io/bazel/rulesscala/dependencyanalyzer/compiler:dep_reporting_compiler`. Now these are references to versioned targets in `@scala_compiler_sources//`, which are aliases to those versioned compiler source repos. In a Bzlmod world, this enables `rules_scala` to import only the `scala_compiler_sources` repo in `MODULE.bazel`, instead of importing each individual versioned compiler source repo. This then allows `rules_scala` clients to set multiple `SCALA_VERSIONS` without requiring them to import this repo or any versioned compiler source repo. The Bzlmodification of the test repos under `dt_patches` (coming in a future change) revealed the need for this flexibility. * Push select() into @scala_compiler_sources//:src Requested by @simuons in #1635.
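A sketch of the "push select() into @scala_compiler_sources//:src" idea: the generated repo exposes one alias whose actual target is selected by the configured Scala version. The repo and config-setting names below are illustrative, not the generated ones.

```py
# BUILD file generated into @scala_compiler_sources (illustrative names).
alias(
    name = "src",
    actual = select({
        "@io_bazel_rules_scala_config//:scala_version_3_3_4": "@scala_compiler_sources_3_3_4//:src",
        "//conditions:default": "@scala_compiler_sources_2_12_19//:src",
    }),
    visibility = ["//visibility:public"],
)
```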
* Update to Scalafmt 3.8.3 Ensures that all Scalafmt targets succeed under every supported Scala version. Part of #1482. Scala 2.11 remains on Scalafmt 2.7.5, since there's not a more recent compatible version. Between 3.0.0 and 3.8.3, Scalafmt's `FileOps` moved packages, `Config.fromHoconFile` moved to `ScalafmtConfig`, and the methods now take a `java.nio.file.Path` parameter. As a result, this required extracting a thin `ScalafmtAdapter` object, with one implementation for Scala 2.11 and Scalafmt 2.7.5, and one for Scalafmt 3.8.3. This change also adds the file path to the `Scalafmt.format()` call, allowing error messages to show the actual file path instead of `<input>`. Removed the `verticalMultiline.newlineBeforeImplicitKW = true` and `unindentTopLevelOperators = false` options from `.scalafmt.conf`. They don't appear to be available in Scalafmt 3.8.3. Also removes some `@io_bazel_rules_scala` references to make the internal implementation less dependent on that name. This will allow Bzlmod clients to use `rules_scala` in a `bazel_dep()` without setting `repo_name = "io_bazel_rules_scala"`. --- Scalafmt 3.0.0 works with the current default Scala version 2.12.19, but breaks under Scala 2.13.14: ```txt $ bazel test --repo_env=SCALA_VERSION=2.13.14 //test/scalafmt/... ERROR: .../test/scalafmt/BUILD:49:20: ScalaFmt test/scalafmt/test/scalafmt/unformatted/unformatted-test.scala.fmt.output failed: (Exit 1): scalafmt failed: error executing command (from target //test/scalafmt:unformatted-test) bazel-out/darwin_arm64-opt-exec-2B5CBBC6/bin/scala/scalafmt/scalafmt '--jvm_flag=-Dfile.encoding=UTF-8' ... (remaining 1 argument skipped) java.util.NoSuchElementException: last of empty IndexedSeq at scala.collection.IndexedSeqOps.last(IndexedSeq.scala:110) at scala.collection.IndexedSeqOps.last$(IndexedSeq.scala:105) at scala.meta.tokens.Tokens.last(Tokens.scala:29) at org.scalafmt.internal.Router.$anonfun$getSplitsImpl$90(Router.scala:707) at scala.Option.map(Option.scala:242) at org.scalafmt.internal.Router.getSplitsImpl(Router.scala:706) at org.scalafmt.internal.Router.getSplits(Router.scala:2314) at org.scalafmt.internal.BestFirstSearch.$anonfun$routes$1(BestFirstSearch.scala:38) [ ...snip... ] ``` This matches: - scala/community-build#1680 Which mentions apparent fixes in: - scalameta/scalameta#3235 - scalameta/scalafmt#3581 So the fix was to upgrade the Scalafmt version. That said, I held its Scalameta dependencies to 4.9.9 per the following link, even though 4.10.2 is out now. - https://mvnrepository.com/artifact/org.scalameta/scalafmt-core_2.13/3.8.3 --- This change updates Scalameta to version 4.9.9 because between 4.9.9 and 4.10.2, Scalafmt breaks Scala 3 file formatting by unindenting some code that it shouldn't. Or, our usage of it breaks somehow; I can't find any open or closed issues in the Scalameta project that match what happens in rules_scala. (Perhaps I can file one eventually.) - https://github.com/scalameta/scalafmt/issues The solution was to keep the Scalameta dependencies at 4.9.9. FWIW, this was one of the most time-consuming bugs to pinpoint and rectify in the entire Bzlmodification process. This was the failing command inside `test_cross_version`: ```txt $ bazel run //scalafmt:unformatted-test3.format-test INFO: Analyzed target //scalafmt:unformatted-test3.format-test (0 packages loaded, 0 targets configured). 
INFO: From ScalaFmt scalafmt/scalafmt/unformatted/unformatted-test3.scala.fmt.output: ``` The `test_cross_version/scalafmt/unformatted/unformatted-test3.scala` file formatted by this test target looks like this: ```scala import org.scalatest.flatspec._ class Test extends AnyFlatSpec: "Test" should "be formatted" in { assert(true) } ``` I hacked `ScalaWorker.format()` to print the `code` variable, and could see that after the first `Scalafmt.format()` pass, the code looks like this: ```scala import org.scalatest.flatspec._ class Test extends AnyFlatSpec: "Test" should "be formatted" in { assert(true) } ``` Since the result doesn't match the original code, it tries to call `Scalafmt.format()` again on this "formatted" code with the incorrect indentation. That's when we get the following, which doesn't look anything like the original file: ```txt Unable to format file due to bug in scalafmt scalafmt/unformatted/unformatted-test3.scala:3: error: [dialect scala3 [with overrides]] `;` expected but `:` found class Test extends AnyFlatSpec: ^ ``` --- As it turns out, bumping to com.google.protobuf:protobuf-java:4.28.2 alone breaks the bump to Scalafmt 3.8.3. Then bumping to rules_proto 6.0.2, with the separate protobuf v21.7, fixes it, presumably by recompiling `protoc`. The in-between breakage happened in the `test_cross_build` project: ```txt $ bazel build //scalafmt:formatted-binary2 INFO: Analyzed target //scalafmt:formatted-binary2 (4 packages loaded, 64 targets configured). ERROR: .../test_cross_build/scalafmt/BUILD:59:22: ScalaFmt scalafmt/scalafmt/formatted/formatted-binary2.scala.fmt.output failed: Worker process did not return a WorkResponse: ---8<---8<--- Start of log, file at .../bazel-workers/worker-3-ScalaFmt.log ---8<---8<--- Exception in thread "main" java.lang.NoSuchMethodError: 'void com.google.devtools.build.lib.worker.WorkerProtocol$WorkRequest.makeExtensionsImmutable()' at com.google.devtools.build.lib.worker.WorkerProtocol$WorkRequest.<init>(WorkerProtocol.java:1029) at com.google.devtools.build.lib.worker.WorkerProtocol$WorkRequest.<init>(WorkerProtocol.java:922) at com.google.devtools.build.lib.worker.WorkerProtocol$WorkRequest$1.parsePartialFrom(WorkerProtocol.java:2482) at com.google.devtools.build.lib.worker.WorkerProtocol$WorkRequest$1.parsePartialFrom(WorkerProtocol.java:2476) at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:192) at com.google.protobuf.AbstractParser.parsePartialDelimitedFrom(AbstractParser.java:232) at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:244) at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:249) at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:25) at com.google.protobuf.GeneratedMessage.parseDelimitedWithIOException(GeneratedMessage.java:367) at com.google.devtools.build.lib.worker.WorkerProtocol$WorkRequest.parseDelimitedFrom(WorkerProtocol.java:1438) at io.bazel.rulesscala.worker.Worker.persistentWorkerMain(Worker.java:81) at io.bazel.rulesscala.worker.Worker.workerMain(Worker.java:49) at io.bazel.rules_scala.scalafmt.ScalafmtWorker$.main(ScalafmtWorker.scala:12) at io.bazel.rules_scala.scalafmt.ScalafmtWorker.main(ScalafmtWorker.scala) ---8<---8<--- End of log ---8<---8<--- Target //scalafmt:formatted-binary2 failed to build ``` * Add attributes to .scalafmt.conf files Per suggestions from @WojciechMazur in #1631. 
* Bump Scala 2 libs after #1636 Brings all the Scalafmt 3.8.3 updates from #1631 up to date with the Scala version bumps from #1636.
Updates targets broken under the Scala 2.13.0 test. Also removes the redundant 2.12.1 test, and cleans the `test_dt_patches_user_srcjar` repo before the `test_compiler_srcjar{,_nonhermetic}` tests. Part of #1482. This fix and the `.bazelversion` sync from #1629 can land in either order. Both are required for `dt_patches/dt_patch_test.sh` to be fully functional. --- The equivalent standalone command in `dt_patches/test_dt_patches` produced the error: ```txt $ bazel build //... --repo_env=SCALA_VERSION=2.13.0 //... ERROR: .../rules_scala~/src/java/io/bazel/rulesscala/scalac/reporter/BUILD:5:13: Building external/rules_scala~/src/java/io/bazel/rulesscala/scalac/reporter/libreporter.jar (2 source files) [for tool] failed: (Exit 1): java failed: error executing Javac command (from target @@rules_scala~//src/java/io/bazel/rulesscala/scalac/reporter:reporter) external/rules_java~~toolchains~remotejdk21_macos_aarch64/bin/java '--add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED' '--add-exports=jdk.compiler/com.sun.tools.javac.main=ALL-UNNAMED' ... (remaining 19 arguments skipped) external/rules_scala~/src/java/io/bazel/rulesscala/scalac/deps_tracking_reporter/ after_2_12_13_and_before_2_13_12/DepsTrackingReporter.java:85: error: method does not override or implement a method from a supertype @OverRide ^ external/rules_scala~/src/java/io/bazel/rulesscala/scalac/deps_tracking_reporter/ after_2_12_13_and_before_2_13_12/DepsTrackingReporter.java:94: error: cannot find symbol ((FilteringReporter) delegateReporter).doReport(pos, msg, severity); ^ symbol: method doReport(Position,String,Reporter.Severity) location: interface FilteringReporter external/rules_scala~/src/java/io/bazel/rulesscala/scalac/deps_tracking_reporter/after_2_12_13_and_before_2_13_12/DepsTrackingReporter.java:99: error: cannot find symbol super.doReport(pos, msg, severity); ^ symbol: method doReport(Position,String,Reporter.Severity) external/rules_scala~/src/java/io/bazel/rulesscala/scalac/reporter/after_2_12_13_and_before_2_13_12/ProtoReporter.java:49: error: method does not override or implement a method from a supertype @OverRide ^ external/rules_scala~/src/java/io/bazel/rulesscala/scalac/reporter/after_2_12_13_and_before_2_13_12/ProtoReporter.java:51: error: cannot find symbol super.doReport(pos, msg, severity); ^ symbol: method doReport(Position,String,Reporter.Severity) Target //dummy:dummy failed to build ``` I had the thought that maybe 2.13.0 could use the same `srcs` as the `before_2_12_13` targets. So I abused the `any` matcher to pick exactly 2.13.0 and assign it the same values as `before_2_12_13`. Now it works, and `dt_patches/dt_patch_test.sh` fully passes. --- Cleaning the `test_dt_patches_user_srcjar` repository helps ensure that the `test_compiler_srcjar_nonhermetic` runs, in particular, don't fail after the first run.
* First create_repository.py refactoring Preserves the behavior of the previous version in preparation for further improvements and eventual behavior changes. Part of #1482. Tested by running the original script, calling `git stash third_party/...`, then running the updated script. Confirmed that the updated script produced no new changes in the working tree compared to the indexed changes. * Second create_repository.py refactoring Preserves the behavior of the previous version in preparation for further improvements and eventual behavior changes. Part of #1482. This change eliminates a lot of duplication, and replaces the previous functions to add trailing commas to the output with a `re.sub()` call. Tested by running the original script, calling `git stash third_party/...`, then running the updated script. Confirmed that the updated script produced no new changes in the working tree compared to the indexed changes. Also: - Added `#!/usr/bin/env python3` and set file mode 0755 to enable running the script directly instead of as an argument to `python3`. - Used `pathlib.Path` on `__file__` to enable running the script from any directory, not just from inside `scripts/`. - Updated `scripts/README.md` to reflect some of the script changes, and to improve the Markdown formatting in general. - Added the `DOWNLOADED_ARTIFACTS_FILE` variable, added its value to `.gitignore`, and added a `Path.unlink()` at the end of the script to clean it up. * Third create_repository.py refactoring Preserves the behavior of the previous version in preparation for further improvements and eventual behavior changes. Part of #1482. This vastly improves performance by _not_ downloading and hashing artifacts that are already present in the current configuration. Added `PROTOBUF_JAVA_VERSION = "4.28.2"` as a core artifact to avoid downgrades, and to signal the importance of this particular artifact. Tested by running the original script, calling `git stash third_party/...`, then running the updated script. Confirmed that the updated script produced no new changes in the working tree compared to the indexed changes, modulo the previous `protobuf-java` downgrade. * Small create_repository.py format updates, fixes Added missing trailing comma in `COORDINATE_GROUPS[0]`. Updated `get_label` to check for groups starting with `org.scala-lang.` Sorted `deps` in `to_rules_scala_compatible_dict` and always set `deps` in the return value, even when empty. * Ensure Scala 2 libs get "_2" repo names in Scala 3 Without this change, it was possible for Scala 3 core library versions to become overwritten with Scala 2 versions specified as dependencies of other jars. Specifically, all the `@io_bazel_rules_scala_scala_*_2` deps explicitly added to the `scala_3_{1,2,3,4,5}.bzl` files in #1631 for Scalafmt get stripped of the `_2` suffix before this change. Also computed `is_scala_3` in `create_file` and passed it through where needed. At some point it might be worth refactoring the script into a proper object instead. * Bump Scalafmt to 3.8.3 in create_repository.py This matches the version set in #1631. * Prevent create_repository.py from downgrading jars The script will now only update an existing artifact if the newly resolved version is greater than that already in the third_party/repositories file. Part of #1482. Tested by running on `master` both before and after #1631. The script produced no new changes, and is even faster now, since it also won't try to downgrade existing artifact versions. 
More specifically, this change prevents the following downgrades: - scalap: from latest Scala version to an earlier version, including 2.13.14 to 2.13.11 - com.lihaoyi.pprint: from pprint_3:0.9.0 to pprint_2.13:0.6.4 - compiler-interface: 1.10.1 to 1.3.5 or 1.9.6 - com.lihaoyi.geny: from geny_3:1.1.1 to geny_2.13:0.6.5 * Ensure all artifact deps written to repo files Updates the main loop in `create_file` to iterate over the newly generated artifact dictionary, not the original dictionary parsed from the existing file. This ensures that all dependencies of the updated artifacts are written to the file. This also improves performance when rerunning, since the script will no longer keep trying to fetch the previously missing artifacts. Tested by running once, `git add`ing the results, and running again to ensure the second run was a no-op. * Update get_label, fix scala3 label bug Fixes a bug in the previous implementation whereby the `scala3_` components of org.scala-lang artifact labels listed in an artifact's `deps` weren't transformed to `scala_`. Improved the handling of org.scala-lang artifacts more generally via the new `adjust_scala_lang_label` function. Refactored the `COORDINATE_GROUPS` array into a series of separate `set` variables named after their `get_label` transformations. Makes the `get_label` implementation read a little more easily. * Replace `split_artifact_and_version` I finally noticed that `get_maven_coordinates` already did the same parsing, essentially. I migrated the comment and implementation from `split_artifact_and_version` into it. * Check against current ResolvedArtifact instances Updates `create_artifact_version_map` to `create_current_resolved_artifacts_map` to set up future improvements. Specifically, an existing artifact's dependency coordinates aren't updated unless the resolved artifact version is newer. A small change after this will ensure all existing dependency coordinates get updated. This could be a one-off change, or we could decide to keep it; either way, it would be a very small one to make. Also: - Added `MavenCoordinates.artifact_name` to replace `artifact_name_from_maven_coordinates`. - Renamed `is_newer_than_current_version` to `is_newer_version`. - Added logic in `create_artifact_metadata_map` to ensure values with `testonly` set to `True` aren't part of the automatic resolution process. * Handle scalap, org.thesamet.scalapb special cases Preserves the existing labels for these artifacts. Tested by running in a new branch that updates the direct dependencies of core artifacts. Prior to this change, the `scalap` repo and some of the `org.thesamet.scalapb` repos were duplicated with a new label. After this change, that duplication disappears. * Add --version, emit command, stderr only on error These are changes suggested by @WojciechMazur in #1639. * Raise and catch SubprocessError, return exit code Further embellishment of the suggestions from @WojciechMazur in #1639. This way we can see exactly which version updates encountered an error, and how many, while continuing to attempt to update other versions. * Hoist output_dir from create_or_update_repo_file Makes the logic slightly easier to follow. * Hoist OUTPUT_DIR as top level constant I still like passing it as an argument to `create_or_update_repository_file`. * Add --output_dir flag, add defaults to --help Making the output directory configurable from the command line was the next logical extension. 
This makes it possible to see what the script will generate from scratch without having to erase the existing repo files. Figured it would be nice to see the default values in the `--help` output. * Create a new empty file if no previous file exists Decided it might be better to literally start from scratch in `--output_dir` than trying to copy a file from `OUTPUT_DIR`. One could always copy files from `OUTPUT_DIR` before running the script if so desired. Also changed the `__main__` logic to exit immediately on `CreateRepositoryError` rather than attempt to keep going. * Remove redundant `file.exists()` check Eliminated the check in `create_or_update_repository_file` in favor of that in `copy_previous_version_or_create_new_file_if_missing`. * Improve `create_repository.py --help` docstrings * Refactor `get_label` in `create_repository.py` `get_label` now uses the `SPECIAL_CASE_GROUP_LABELS` dict instead of the `LAST_GROUP_COMPONENT_GROUP` and `NEXT_TO_LAST_GROUP_COMPONENT_GROUP` sets. Also added `com.google.guava` to `ARTIFACT_LABEL_ONLY_GROUPS` to generate the correct `io_bazel_rules_scala_guava` label. Added `com.google.api.grpc` and `dev.dirs.directories` to `SCALA_PROTO_RULES_GROUPS`. Updated indentation of `ARTIFACT_LABEL_ONLY_GROUPS` and `GROUP_AND_ARTIFACT_LABEL_GROUPS` elements. * Emit correct label for scala-collection-compat Specifically, `org_scala_lang_modules_scala_collection_compat`.
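To make the downgrade guard from the "Prevent create_repository.py from downgrading jars" commit concrete, here is a minimal sketch, assuming plain dotted numeric versions. The real script also has to cope with suffixes such as `1.0.0-alpha.1`, so the hypothetical `maybe_update_version` helper below only illustrates the comparison, not the actual `is_newer_version` implementation.

```python
# Sketch only: assumes plain "x.y.z" numeric versions.
def is_newer_version(candidate: str, current: str) -> bool:
    def as_tuple(version: str) -> tuple:
        return tuple(int(part) for part in version.split("."))
    return as_tuple(candidate) > as_tuple(current)

def maybe_update_version(entry: dict, resolved_version: str) -> None:
    """Only touch an existing artifact when resolution found a strictly newer version."""
    if is_newer_version(resolved_version, entry["version"]):
        entry["version"] = resolved_version

# The downgrades listed above would be skipped, e.g. for compiler-interface and scalap:
assert is_newer_version("1.10.1", "1.3.5")         # genuine upgrade, applied
assert not is_newer_version("1.9.6", "1.10.1")     # downgrade attempt, skipped
assert not is_newer_version("2.13.11", "2.13.14")  # scalap downgrade, skipped
```

Numeric tuple comparison is what makes `1.9.6` lose to `1.10.1`; a naive string comparison would rank `1.9.6` higher because `'9' > '1'`.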
This catches errors that happen at any point in `ExtraProtobufGenerator.run`, not just within `handleCodeGeneratorRequest`. This, in turn, prevents more cases of `ProtoScalaPBRule` aspect workers hanging. Part of #1482.

As I started experimenting with updating the gRPC and ScalaPB libs, I bumped the ScalaPB libs to 1.0.0-alpha.1 to try to resolve a problem. The `ProtoScalaPBRule` workers then started hanging again. After pulling the `catch ... Throwable` block from #1630 up into `ExtraProtobufGenerator.run`, I got the following stack trace:

```txt
$ bazel build //test/proto/...
ERROR: .../external/com_google_protobuf/BUILD.bazel:334:14: ProtoScalaPBRule
external/com_google_protobuf/wrappers_proto_jvm_extra_protobuf_generator_scalapb.srcjar
failed: (Exit 1): scalapb_worker failed: error executing command
(from target @com_google_protobuf//:wrappers_proto)
bazel-out/.../bin/src/scala/scripts/scalapb_worker ... (remaining 2 arguments skipped)

--jvm_extra_protobuf_generator_out: java.lang.IllegalAccessError: class scalapb.options.Scalapb$ScalaPbOptions tried to access method 'com.google.protobuf.LazyStringArrayList com.google.protobuf.LazyStringArrayList.emptyList()' (scalapb.options.Scalapb$ScalaPbOptions and com.google.protobuf.LazyStringArrayList are in unnamed module of loader 'app')
    at scalapb.options.Scalapb$ScalaPbOptions.<init>(Scalapb.java:5021)
    at scalapb.options.Scalapb$ScalaPbOptions.<clinit>(Scalapb.java:11165)
    at scalapb.options.Scalapb.<clinit>(Scalapb.java:24184)
    at scalapb.options.compiler.Scalapb$.registerAllExtensions(Scalapb.scala:8)
    at scalarules.test.extra_protobuf_generator.ExtraProtobufGenerator$.run(ExtraProtobufGenerator.scala:48)
    at protocbridge.frontend.PluginFrontend$.$anonfun$runWithBytes$1(PluginFrontend.scala:51)
    at scala.util.Try$.apply(Try.scala:213)
    at protocbridge.frontend.PluginFrontend$.runWithBytes(PluginFrontend.scala:51)
    at protocbridge.frontend.PluginFrontend$.runWithInputStream(PluginFrontend.scala:121)
    at protocbridge.frontend.PosixPluginFrontend$.$anonfun$prepare$2(PosixPluginFrontend.scala:40)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.concurrent.impl.ExecutionContextImpl$DefaultThreadFactory$$anon$1$$anon$2.block(ExecutionContextImpl.scala:75)
    at java.base/java.util.concurrent.ForkJoinPool.managedBlock(ForkJoinPool.java:3118)
    at scala.concurrent.impl.ExecutionContextImpl$DefaultThreadFactory$$anon$1.blockOn(ExecutionContextImpl.scala:87)
    at scala.concurrent.package$.blocking(package.scala:146)
    at protocbridge.frontend.PosixPluginFrontend$.$anonfun$prepare$1(PosixPluginFrontend.scala:38)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
    at scala.util.Success.$anonfun$map$1(Try.scala:255)
    at scala.util.Success.map(Try.scala:213)
    at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:42)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:74)
    at java.base/java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1426)
    at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
    at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
    at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
    at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
    at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)

java.lang.RuntimeException: Exit with code 1
    at scala.sys.package$.error(package.scala:30)
    at scripts.ScalaPBWorker$.work(ScalaPBWorker.scala:44)
    at io.bazel.rulesscala.worker.Worker.persistentWorkerMain(Worker.java:96)
    at io.bazel.rulesscala.worker.Worker.workerMain(Worker.java:49)
    at scripts.ScalaPBWorker$.main(ScalaPBWorker.scala:39)
    at scripts.ScalaPBWorker.main(ScalaPBWorker.scala)
ERROR: .../external/com_google_protobuf/BUILD.bazel:334:14 scala @com_google_protobuf//:wrappers_proto failed: (Exit 1): scalapb_worker failed: error executing command (from target @com_google_protobuf//:wrappers_proto) bazel-out/.../bin/src/scala/scripts/scalapb_worker ... (remaining 2 arguments skipped)
```

Bumping to protobuf v28.2 resolved the issue, at the cost of dropping Bazel 6 support, as v21.7 was the last protobuf release to support Bazel 6. But this issue is orthogonal to the original gRPC test failure issue, which I'll address in a future change.
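The actual fix is in the Scala `ExtraProtobufGenerator`; the sketch below (in Python, to match the other examples here, and entirely hypothetical) only illustrates the pattern the commit describes: wrap the whole generator entry point, not just the request handler, so that any throwable becomes an explicit error response instead of leaving the worker blocked waiting on the plugin frontend's pipe.

```python
from typing import Callable

def run_generator(request: bytes, handle_request: Callable[[bytes], bytes]) -> bytes:
    """Pattern sketch only: the real change wraps ExtraProtobufGenerator.run
    in Scala with a `catch { case t: Throwable => ... }` block."""
    try:
        return handle_request(request)
    except BaseException as exc:  # analogous to catching Throwable on the JVM
        # Linkage problems like the IllegalAccessError above are raised outside
        # the request handler; catching at the outermost level means the caller
        # still receives a response and the error is surfaced instead of a hang.
        return f"error: {exc!r}".encode("utf-8")

def boom(_: bytes) -> bytes:
    raise RuntimeError("simulated codegen failure")

print(run_generator(b"", boom))  # b"error: RuntimeError('simulated codegen failure')"
```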