diff --git a/.gitignore b/.gitignore index d2645e022c..af5d767bea 100644 --- a/.gitignore +++ b/.gitignore @@ -29,3 +29,4 @@ tags /etc/scripts/stack-scripts.cabal .hspec-failures better-cache/ +/subs/*/*.cabal diff --git a/ChangeLog.md b/ChangeLog.md index fb54f56f50..5a5dbb8c4d 100644 --- a/ChangeLog.md +++ b/ChangeLog.md @@ -60,9 +60,13 @@ Major changes: is uniquely identified by a commit id and an Hadrian "flavour" (Hadrian is the newer GHC build system), hence `compiler` can be set to use a GHC built from source with `ghc-git-COMMIT-FLAVOUR` +* `stack.yaml` now supports `configure-options`, which are passed directly to + the `configure` step in the Cabal build process. See + [#1438](https://github.com/commercialhaskell/stack/issues/1438) * Remove support for building GHCJS itself. Future releases of Stack may remove GHCJS support entirely. +* Add support for lock files for pinning exact project dependency versions Behavior changes: * `stack.yaml` now supports `snapshot`: a synonym for `resolver`. See [#4256](https://github.com/commercialhaskell/stack/issues/4256) @@ -116,6 +120,17 @@ Behavior changes: means that Stack will no longer have to force reconfigures as often. See [#3554](https://github.com/commercialhaskell/stack/issues/3554). +* When building a package, Stack takes a lock on the dist directory in + use to prevent multiple runs of Stack from trampling each other's + files. See + [#2730](https://github.com/commercialhaskell/stack/issues/2730). + +* Stack will occasionally check if there is a new version available and prompt + the user to upgrade. This will not incur any additional network traffic, as + it will piggy-back on the existing Hackage index updates. You can set + `recommend-stack-upgrade: false` to bypass this. See + [#1681](https://github.com/commercialhaskell/stack/issues/1681). + Other enhancements: * Defer loading up of files for local packages. This allows us to get diff --git a/doc/lock_files.md b/doc/lock_files.md index fc6260158b..fee779ccd5 100644 --- a/doc/lock_files.md +++ b/doc/lock_files.md @@ -18,11 +18,12 @@ set of input files. There are a few problems with making this work: To address this, we follow the (fairly standard) approach of having a _lock file_. The goal of the lock file is to cache completed -information about all packages and snapshot files so that: +locations of project packages, snapshot packages, and the snapshots themselves so that: * These files can be stored in source control * Users on other machines can reuse these lock files and get identical - build plans + build plans, provided that the local packages and local snapshots used are + the same on those machines * Rerunning `stack build` in the future is deterministic in the build plan, not depending on mutable state in the world like Hackage revisions @@ -31,8 +32,6 @@ information about all packages and snapshot files so that: to perform the build. However, by deterministic, we mean it either performs the same build or fails, never accidentally doing something different. -* Stack can quickly determine the build plan in the common case of no - changes to `stack.yaml` or snapshot files This document explains the contents of a lock file, how they are used, and how they are created and updated. @@ -42,11 +41,7 @@ and how they are created and updated. Relevant to this discussion, the `stack.yaml` file specifies: * Resolver (the parent snapshot) -* Compiler override * `extra-deps` -* Flags -* GHC options -* Hidden packages The resolver can either specify a compiler version or another snapshot file.
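For concreteness, a minimal `stack.yaml` along these lines might look like the following sketch (the snapshot path and the extra-dep shown are hypothetical, chosen only to illustrate the two fields):

```yaml
# Resolver pointing at a snapshot file instead of a plain compiler version
resolver: ./my-snapshot.yaml

# Additional packages layered on top of that snapshot
extra-deps:
- acme-missiles-0.3
```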
This snapshot file can contain the same information referenced @@ -55,12 +50,7 @@ above for a `stack.yaml`, with the following differences: * The `extra-deps` are called `packages` * Drop packages can be included -Some of this information is, by its nature, complete. For example, the -"flags" field cannot be influenced by anything outside of the file -itself. - -On the other hand, some information in these files can be -incomplete. Consider: +Some information in these files can be incomplete. Consider: ```yaml resolver: lts-13.9 @@ -128,24 +118,16 @@ parsing of the additional files in the common case of no changes. The lock file contains the following information: -* The full snapshot definition information, including completed - package locations for both `extra-deps` and packages in +* Completed package locations for both `extra-deps` and packages in snapshot files * **NOTE** This only applies to _immutable_ packages. Mutable packages are not included in the lock file. * Completed information for the snapshot locations -* A hash of the `stack.yaml` file -* The snapshot hash, to bypass the need to recalculate this on each - run of Stack It looks like the following: ```yaml # Lock file, some message about the file being auto-generated -stack-yaml: - sha256: XXXX - size: XXXX # in bytes - snapshots: # Starts with the snapshot specified in stack.yaml, # then continues with the snapshot specified in each @@ -163,33 +145,22 @@ snapshots: url: https://raw.githubusercontent.com/commercialhaskell/stackage-snapshots/master/lts/13/9.yaml sha256: 83de9017d911cf7795f19353dba4d04bd24cd40622b7567ff61fc3f7223aa3ea -compiler: ghc-X.Y.Z - packages: - acme-missiles: - location: - # QUESTION: any reason we need to specify which snapshot file it came from? I don't think so... - original: https://hackage.haskell.org/package/acme-missiles-0.3.tar.gz - completed: - size: 1442 - url: https://hackage.haskell.org/package/acme-missiles-0.3.tar.gz - cabal-file: - size: 613 - sha256: 2ba66a092a32593880a87fb00f3213762d7bca65a687d45965778deb8694c5d1 - name: acme-missiles - version: '0.3' - sha256: e563d8b524017a06b32768c4db8eff1f822f3fb22a90320b7e414402647b735b - pantry-tree: - size: 226 - sha256: 614bc0cca76937507ea0a5ccc17a504c997ce458d7f2f9e43b15a10c8eaeb033 - flags: ... - hidden: true/false - ghc-options: [...] +- original: https://hackage.haskell.org/package/acme-missiles-0.3.tar.gz + completed: + size: 1442 + url: https://hackage.haskell.org/package/acme-missiles-0.3.tar.gz + cabal-file: + size: 613 + sha256: 2ba66a092a32593880a87fb00f3213762d7bca65a687d45965778deb8694c5d1 + name: acme-missiles + version: '0.3' + sha256: e563d8b524017a06b32768c4db8eff1f822f3fb22a90320b7e414402647b735b + pantry-tree: + size: 226 + sha256: 614bc0cca76937507ea0a5ccc17a504c997ce458d7f2f9e43b15a10c8eaeb033 ``` -**NOTE** The `original` fields may seem superfluous at first. See the -update procedure below for an explanation. - ## Creation Whenever a `stack.yaml` file is loaded, Stack checks for a lock file @@ -206,36 +177,9 @@ If the lock file does not exist, it will be created by: * Completing all missing information * Writing out the new `stack.yaml.lock` file -## Dirtiness checking - -If the `stack.yaml.lock` file exists, its last modification time is -compared against the last modification time of the `stack.yaml` file -and any local snapshot files. If any of those files is more recent -than the `stack.yaml` file, and the file hashes in the lock file -do not match the files on the filesystem, then the update procedure is -triggered. 
Otherwise, the `stack.yaml.lock` file can be used as the -definition of the snapshot. - ## Update procedure -The simplest possible implementation is: ignore the lock file entirely -and create a new one followign the creation steps above. There's a -significant downside to this, however: it may cause a larger delta in -the lock file than intended, by causing more packages to be -updates. For example, many packages from Hackage may have their -Hackage revision information updated unnecessarily. - -The more complicated update procedure is described below. **QUESTION** -Do we want to go the easy way at first and later implement the more -complicated update procedure? - -1. Create a map from original package location to completed package - location in the lock file -2. Load up each snapshot file -3. For each incomplete package location: - * Lookup the value in the map created in (1) - * If present: use that completed information - * Otherwise: complete the information using the same completion - procedure from Pantry as in "creation" - -This should minimize the number of changes to packages incurred. +When loading a Stack project, all completed package and snapshot locations +(even those completed using information from a lock file) are collected to +form a new lock file in memory and compared against the one on disk; the +file on disk is rewritten if there are any differences. diff --git a/doc/yaml_configuration.md b/doc/yaml_configuration.md index 1743e4a0a0..808f6a33f1 100644 --- a/doc/yaml_configuration.md +++ b/doc/yaml_configuration.md @@ -585,6 +585,24 @@ on an options change, but this behavior can be changed back with the following: rebuild-ghc-options: true ``` +### configure-options + +Options that are passed to the configure step of the Cabal build process. +These can be set either by package name or by using the `$everything`, +`$targets`, and `$locals` special keys. These special keys have the same +meaning as in `ghc-options`. + +```yaml +configure-options: + $everything: + - --with-gcc + - /some/path + my-package: + - --another-flag +``` + +(Since 2.0) + ### ghc-variant (Since 0.1.5) @@ -1072,4 +1090,12 @@ Build output when disabled: ... ``` +### recommend-stack-upgrade + +When Stack notices that a new version of Stack is available, should it notify the user? + +```yaml +recommend-stack-upgrade: true +``` + Since 2.0 diff --git a/package.yaml b/package.yaml index ddf427abe0..de521b734a 100644 --- a/package.yaml +++ b/package.yaml @@ -1,5 +1,5 @@ name: stack -version: '1.10.0' +version: '2.0.0' synopsis: The Haskell Tool Stack description: ! 'Please see the README.md for usage information, and the wiki on Github for more details.
Also, note that @@ -123,6 +123,7 @@ dependencies: - yaml - zip-archive - zlib +- binary when: - condition: os(windows) then: @@ -183,6 +184,8 @@ library: - Stack.IDE - Stack.Init - Stack.Ls + - Stack.Lock + - Stack.ModuleInterface - Stack.New - Stack.Nix - Stack.Options.BenchParser @@ -302,6 +305,7 @@ tests: dependencies: - QuickCheck - hspec + - raw-strings-qq - stack - smallcheck flags: diff --git a/snapshot-lts-12.yaml b/snapshot-lts-12.yaml index f667e35dd4..9e8fa9b40e 100644 --- a/snapshot-lts-12.yaml +++ b/snapshot-lts-12.yaml @@ -10,6 +10,8 @@ packages: - yaml-0.10.4.0@rev:0 #for hpack-0.31 - persistent-2.9.2@rev:0 - persistent-sqlite-2.9.3@rev:0 +- github: snoyberg/filelock + commit: 97e83ecc133cd60a99df8e1fa5a3c2739ad007dc drop-packages: # See https://github.com/commercialhaskell/stack/pull/4712 diff --git a/snapshot-nightly.yaml b/snapshot-nightly.yaml index 511f4d6d7b..9935e3db81 100644 --- a/snapshot-nightly.yaml +++ b/snapshot-nightly.yaml @@ -4,6 +4,8 @@ name: snapshot-for-building-stack-with-ghc-8.6.2 packages: - persistent-2.9.2@rev:0 - persistent-sqlite-2.9.3@rev:0 +- github: snoyberg/filelock + commit: 97e83ecc133cd60a99df8e1fa5a3c2739ad007dc drop-packages: # See https://github.com/commercialhaskell/stack/pull/4712 diff --git a/snapshot.yaml b/snapshot.yaml index 2ad267e02e..9a512aac30 100644 --- a/snapshot.yaml +++ b/snapshot.yaml @@ -19,6 +19,8 @@ packages: - process-1.6.3.0@sha256:fc77cfe75a9653b8c54ae455ead8c06cb8adc4d7a340984d84d8ca880b579919,2370 #because of https://github.com/haskell/process/pull/101 - persistent-2.9.2@rev:0 - persistent-sqlite-2.9.3@rev:0 +- github: snoyberg/filelock + commit: 97e83ecc133cd60a99df8e1fa5a3c2739ad007dc drop-packages: # See https://github.com/commercialhaskell/stack/pull/4712 diff --git a/src/Stack/Build.hs b/src/Stack/Build.hs index fe52dc4b22..557baa07c7 100644 --- a/src/Stack/Build.hs +++ b/src/Stack/Build.hs @@ -244,9 +244,10 @@ loadPackage :: (HasBuildConfig env, HasSourceMap env) => PackageLocationImmutable -> Map FlagName Bool - -> [Text] + -> [Text] -- ^ GHC options + -> [Text] -- ^ Cabal configure options -> RIO env Package -loadPackage loc flags ghcOptions = do +loadPackage loc flags ghcOptions cabalConfigOpts = do compiler <- view actualCompilerVersionL platform <- view platformL let pkgConfig = PackageConfig @@ -254,6 +255,7 @@ loadPackage loc flags ghcOptions = do , packageConfigEnableBenchmarks = False , packageConfigFlags = flags , packageConfigGhcOptions = ghcOptions + , packageConfigCabalConfigOpts = cabalConfigOpts , packageConfigCompilerVersion = compiler , packageConfigPlatform = platform } diff --git a/src/Stack/Build/ConstructPlan.hs b/src/Stack/Build/ConstructPlan.hs index d2d29970d3..18de09371b 100644 --- a/src/Stack/Build/ConstructPlan.hs +++ b/src/Stack/Build/ConstructPlan.hs @@ -118,7 +118,7 @@ type M = RWST -- TODO replace with more efficient WS stack on top of StackT data Ctx = Ctx { baseConfigOpts :: !BaseConfigOpts - , loadPackage :: !(PackageLocationImmutable -> Map FlagName Bool -> [Text] -> M Package) + , loadPackage :: !(PackageLocationImmutable -> Map FlagName Bool -> [Text] -> [Text] -> M Package) , combinedMap :: !CombinedMap , ctxEnvConfig :: !EnvConfig , callStack :: ![PackageName] @@ -171,7 +171,7 @@ instance HasEnvConfig Ctx where constructPlan :: forall env. 
HasEnvConfig env => BaseConfigOpts -> [DumpPackage] -- ^ locally registered - -> (PackageLocationImmutable -> Map FlagName Bool -> [Text] -> RIO EnvConfig Package) -- ^ load upstream package + -> (PackageLocationImmutable -> Map FlagName Bool -> [Text] -> [Text] -> RIO EnvConfig Package) -- ^ load upstream package -> SourceMap -> InstalledMap -> Bool @@ -231,8 +231,8 @@ constructPlan baseConfigOpts0 localDumpPkgs loadPackage0 sourceMap installedMap mkCtx econfig globalCabalVersion sources mcur pathEnvVar' = Ctx { baseConfigOpts = baseConfigOpts0 - , loadPackage = \x y z -> runRIO econfig $ - applyForceCustomBuild globalCabalVersion <$> loadPackage0 x y z + , loadPackage = \w x y z -> runRIO econfig $ + applyForceCustomBuild globalCabalVersion <$> loadPackage0 w x y z , combinedMap = combineMap sources installedMap , ctxEnvConfig = econfig , callStack = [] @@ -426,7 +426,7 @@ addDep name = do -- names. This code does not feel right. let version = installedVersion installed askPkgLoc = liftRIO $ do - mrev <- getLatestHackageRevision name version + mrev <- getLatestHackageRevision YesRequireHackageIndex name version case mrev of Nothing -> do -- this could happen for GHC boot libraries missing from Hackage @@ -469,7 +469,7 @@ tellExecutablesUpstream name retrievePkgLoc loc flags = do when (name `Set.member` wanted ctx) $ do mPkgLoc <- retrievePkgLoc forM_ mPkgLoc $ \pkgLoc -> do - p <- loadPackage ctx pkgLoc flags [] + p <- loadPackage ctx pkgLoc flags [] [] tellExecutablesPackage loc p tellExecutablesPackage :: InstallLocation -> Package -> M () @@ -505,7 +505,7 @@ installPackage name ps minstalled = do case ps of PSRemote pkgLoc _version _fromSnaphot cp -> do planDebug $ "installPackage: Doing all-in-one build for upstream package " ++ show name - package <- loadPackage ctx pkgLoc (cpFlags cp) (cpGhcOptions cp) + package <- loadPackage ctx pkgLoc (cpFlags cp) (cpGhcOptions cp) (cpCabalConfigOpts cp) resolveDepsAndInstall True (cpHaddocks cp) ps package minstalled PSFilePath lp -> do case lpTestBench lp of @@ -662,7 +662,7 @@ addPackageDeps package = do eres <- addDep depname let getLatestApplicableVersionAndRev :: M (Maybe (Version, BlobKey)) getLatestApplicableVersionAndRev = do - vsAndRevs <- runRIO ctx $ getHackagePackageVersions UsePreferredVersions depname + vsAndRevs <- runRIO ctx $ getHackagePackageVersions YesRequireHackageIndex UsePreferredVersions depname pure $ do lappVer <- latestApplicableVersion range $ Map.keysSet vsAndRevs revs <- Map.lookup lappVer vsAndRevs @@ -1057,7 +1057,7 @@ pprintExceptions exceptions stackYaml stackRoot parentMap wanted' prunedGlobalDe pprintExtra (name, (version, BlobKey cabalHash cabalSize)) = let cfInfo = CFIHash cabalHash (Just cabalSize) packageIdRev = PackageIdentifierRevision name version cfInfo - in "- " <+> fromString (T.unpack (utf8BuilderToText (RIO.display packageIdRev))) + in fromString ("- " ++ T.unpack (utf8BuilderToText (RIO.display packageIdRev))) allNotInBuildPlan = Set.fromList $ concatMap toNotInBuildPlan exceptions' toNotInBuildPlan (DependencyPlanFailures _ pDeps) = diff --git a/src/Stack/Build/Execute.hs b/src/Stack/Build/Execute.hs index 4570542049..cfb4eec702 100644 --- a/src/Stack/Build/Execute.hs +++ b/src/Stack/Build/Execute.hs @@ -84,12 +84,14 @@ import Stack.Types.Version import qualified System.Directory as D import System.Environment (getExecutablePath, lookupEnv) import System.Exit (ExitCode (..)) +import System.FileLock (withTryFileLock, SharedExclusive (Exclusive), withFileLock) import qualified System.FilePath as 
FP import System.IO (stderr, stdout) import System.PosixCompat.Files (createLink, modificationTime, getFileStatus) import System.PosixCompat.Time (epochTime) import RIO.PrettyPrint import RIO.Process +import Pantry.Internal.Companion -- | Has an executable been built or not? data ExecutableBuildStatus @@ -924,10 +926,45 @@ packageNamePrefix ee name' = Just len -> assert (len >= length name) $ RIO.take len $ name ++ repeat ' ' in fromString paddedName <> "> " -announceTask :: HasLogFunc env => ExecuteEnv -> Task -> Text -> RIO env () +announceTask :: HasLogFunc env => ExecuteEnv -> Task -> Utf8Builder -> RIO env () announceTask ee task action = logInfo $ packageNamePrefix ee (pkgName (taskProvides task)) <> - RIO.display action + action + +-- | Ensure we're the only action using the directory. See +-- +withLockedDistDir + :: HasEnvConfig env + => (Utf8Builder -> RIO env ()) -- ^ announce + -> Path Abs Dir -- ^ root directory for package + -> RIO env a + -> RIO env a +withLockedDistDir announce root inner = do + distDir <- distRelativeDir + let lockFP = root distDir relFileBuildLock + ensureDir $ parent lockFP + + mres <- + withRunInIO $ \run -> + withTryFileLock (toFilePath lockFP) Exclusive $ \_lock -> + run inner + + case mres of + Just res -> pure res + Nothing -> do + let complainer delay = do + delay 5000000 -- 5 seconds + announce $ "blocking for directory lock on " <> fromString (toFilePath lockFP) + forever $ do + delay 30000000 -- 30 seconds + announce $ "still blocking for directory lock on " <> + fromString (toFilePath lockFP) <> + "; maybe another Stack process is running?" + withCompanion complainer $ + \stopComplaining -> + withRunInIO $ \run -> + withFileLock (toFilePath lockFP) Exclusive $ \_ -> + run $ stopComplaining *> inner -- | How we deal with output from GHC, either dumping to a log file or the -- console (with some prefix). @@ -962,7 +999,7 @@ withSingleContext :: forall env a. HasEnvConfig env -- argument, but we provide both to avoid recalculating `parent` of the `File`. -> (KeepOutputOpen -> ExcludeTHLoading -> [String] -> RIO env ()) -- Function to run Cabal with args - -> (Text -> RIO env ()) -- An 'announce' function, for different build phases + -> (Utf8Builder -> RIO env ()) -- An 'announce' function, for different build phases -> OutputType -> RIO env a) -> RIO env a @@ -995,7 +1032,10 @@ withSingleContext ActionContext {..} ee@ExecuteEnv {..} task@Task {..} mdeps msu withPackage inner = case taskType of - TTLocalMutable lp -> inner (lpPackage lp) (lpCabalFile lp) (parent $ lpCabalFile lp) + TTLocalMutable lp -> do + let root = parent $ lpCabalFile lp + withLockedDistDir announce root $ + inner (lpPackage lp) (lpCabalFile lp) root TTRemotePackage _ package pkgloc -> do suffix <- parseRelDir $ packageIdentifierString $ packageIdent package let dir = eeTempDir suffix @@ -1478,7 +1518,7 @@ singleBuild ac@ActionContext {..} ee@ExecuteEnv {..} task@Task {..} installedMap ("Building all executables for `" <> fromString (packageNameString (packageName package)) <> "' once. 
After a successful build of all of them, only specified executables will be rebuilt.")) - _neededConfig <- ensureConfig cache pkgDir ee (announce ("configure" <> annSuffix executableBuildStatuses)) cabal cabalfp task + _neededConfig <- ensureConfig cache pkgDir ee (announce ("configure" <> RIO.display (annSuffix executableBuildStatuses))) cabal cabalfp task let installedMapHasThisPkg :: Bool installedMapHasThisPkg = @@ -1501,7 +1541,7 @@ singleBuild ac@ActionContext {..} ee@ExecuteEnv {..} task@Task {..} installedMap Just <$> realBuild cache package pkgDir cabal0 announce executableBuildStatuses initialBuildSteps executableBuildStatuses cabal announce = do - () <- announce ("initial-build-steps" <> annSuffix executableBuildStatuses) + announce ("initial-build-steps" <> RIO.display (annSuffix executableBuildStatuses)) cabal KeepTHLoading ["repl", "stack-initial-build-steps"] realBuild @@ -1509,7 +1549,7 @@ singleBuild ac@ActionContext {..} ee@ExecuteEnv {..} task@Task {..} installedMap -> Package -> Path Abs Dir -> (KeepOutputOpen -> ExcludeTHLoading -> [String] -> RIO env ()) - -> (Text -> RIO env ()) + -> (Utf8Builder -> RIO env ()) -> Map Text ExecutableBuildStatus -> RIO env Installed realBuild cache package pkgDir cabal0 announce executableBuildStatuses = do @@ -1552,7 +1592,7 @@ singleBuild ac@ActionContext {..} ee@ExecuteEnv {..} task@Task {..} installedMap line <> line <> "Missing modules in the cabal file are likely to cause undefined reference errors from the linker, along with other problems." - () <- announce ("build" <> annSuffix executableBuildStatuses) + () <- announce ("build" <> RIO.display (annSuffix executableBuildStatuses)) config <- view configL extraOpts <- extraBuildOptions wc eeBuildOpts let stripTHLoading @@ -1893,7 +1933,7 @@ singleTest topts testsToRun ac ee task installedMap = do argsDisplay = case args of [] -> "" _ -> ", args: " <> T.intercalate " " (map showProcessArgDebug args) - announce $ "test (suite: " <> testName <> argsDisplay <> ")" + announce $ "test (suite: " <> RIO.display testName <> RIO.display argsDisplay <> ")" -- Clear "Progress: ..." message before -- redirecting output. @@ -1951,7 +1991,7 @@ singleTest topts testsToRun ac ee task installedMap = do -- tidiness. 
when needHpc $ updateTixFile (packageName package) tixPath testName' - let announceResult result = announce $ "Test suite " <> testName <> " " <> result + let announceResult result = announce $ "Test suite " <> RIO.display testName <> " " <> result case mec of Just ExitSuccess -> do announceResult "passed" @@ -2125,9 +2165,8 @@ extraBuildOptions :: (HasEnvConfig env, HasRunner env) => WhichCompiler -> BuildOpts -> RIO env [String] extraBuildOptions wc bopts = do colorOpt <- appropriateGhcColorFlag - let ddumpOpts = " -ddump-hi -ddump-to-file" - optsFlag = compilerOptionsCabalFlag wc - baseOpts = ddumpOpts ++ maybe "" (" " ++) colorOpt + let optsFlag = compilerOptionsCabalFlag wc + baseOpts = maybe "" (" " ++) colorOpt if toCoverage (boptsTestOpts bopts) then do hpcIndexDir <- toFilePathNoTrailingSep <$> hpcRelativeDir diff --git a/src/Stack/Build/Source.hs b/src/Stack/Build/Source.hs index 1b77fd31a2..0e69ea8a7f 100644 --- a/src/Stack/Build/Source.hs +++ b/src/Stack/Build/Source.hs @@ -85,6 +85,8 @@ loadSourceMap smt boptsCli sma = do flags = getLocalFlags boptsCli name ghcOptions = generalGhcOptions bconfig boptsCli isTarget isProjectPackage + cabalConfigOpts = + loadCabalConfigOpts bconfig (cpName common) isTarget isProjectPackage in common { cpFlags = if M.null flags @@ -92,6 +94,8 @@ loadSourceMap smt boptsCli sma = do else flags , cpGhcOptions = ghcOptions ++ cpGhcOptions common + , cpCabalConfigOpts = + cabalConfigOpts ++ cpCabalConfigOpts common , cpHaddocks = if isTarget then boptsHaddock bopts @@ -165,10 +169,12 @@ depPackageHashableContent DepPackage {..} = do else "-" <> fromString (C.unFlagName f) flags = map flagToBs $ Map.toList (cpFlags dpCommon) ghcOptions = map display (cpGhcOptions dpCommon) + cabalConfigOpts = map display (cpCabalConfigOpts dpCommon) haddocks = if cpHaddocks dpCommon then "haddocks" else "" hash = immutableLocSha pli return $ hash <> haddocks <> getUtf8Builder (mconcat flags) <> - getUtf8Builder (mconcat ghcOptions) + getUtf8Builder (mconcat ghcOptions) <> + getUtf8Builder (mconcat cabalConfigOpts) -- | All flags for a local package. getLocalFlags @@ -182,6 +188,21 @@ getLocalFlags boptsCli name = Map.unions where cliFlags = boptsCLIFlags boptsCli +-- | Get the options to pass to @./Setup.hs configure@ +loadCabalConfigOpts :: BuildConfig -> PackageName -> Bool -> Bool -> [Text] +loadCabalConfigOpts bconfig name isTarget isLocal = concat + [ Map.findWithDefault [] CCKEverything (configCabalConfigOpts config) + , if isLocal + then Map.findWithDefault [] CCKLocals (configCabalConfigOpts config) + else [] + , if isTarget + then Map.findWithDefault [] CCKTargets (configCabalConfigOpts config) + else [] + , Map.findWithDefault [] (CCKPackage name) (configCabalConfigOpts config) + ] + where + config = view configL bconfig + -- | Get the configured options to pass from GHC, based on the build -- configuration and commandline. 
generalGhcOptions :: BuildConfig -> BuildOptsCLI -> Bool -> Bool -> [Text] @@ -230,7 +251,7 @@ loadCommonPackage :: => CommonPackage -> RIO env Package loadCommonPackage common = do - config <- getPackageConfig (cpFlags common) (cpGhcOptions common) + config <- getPackageConfig (cpFlags common) (cpGhcOptions common) (cpCabalConfigOpts common) gpkg <- liftIO $ cpGPD common return $ resolvePackage config gpkg @@ -245,7 +266,7 @@ loadLocalPackage pp = do let common = ppCommon pp bopts <- view buildOptsL mcurator <- view $ buildConfigL.to bcCurator - config <- getPackageConfig (cpFlags common) (cpGhcOptions common) + config <- getPackageConfig (cpFlags common) (cpGhcOptions common) (cpCabalConfigOpts common) gpkg <- ppGPD pp let name = cpName common mtarget = M.lookup name (smtTargets $ smTargets sm) @@ -496,9 +517,10 @@ calcFci modTime' fp = liftIO $ getPackageConfig :: (HasBuildConfig env, HasSourceMap env) => Map FlagName Bool - -> [Text] + -> [Text] -- ^ GHC options + -> [Text] -- ^ cabal config opts -> RIO env PackageConfig -getPackageConfig flags ghcOptions = do +getPackageConfig flags ghcOptions cabalConfigOpts = do platform <- view platformL compilerVersion <- view actualCompilerVersionL return PackageConfig @@ -506,6 +528,7 @@ getPackageConfig flags ghcOptions = do , packageConfigEnableBenchmarks = False , packageConfigFlags = flags , packageConfigGhcOptions = ghcOptions + , packageConfigCabalConfigOpts = cabalConfigOpts , packageConfigCompilerVersion = compilerVersion , packageConfigPlatform = platform } diff --git a/src/Stack/Build/Target.hs b/src/Stack/Build/Target.hs index 6f5875ae28..11defe344a 100644 --- a/src/Stack/Build/Target.hs +++ b/src/Stack/Build/Target.hs @@ -343,7 +343,7 @@ resolveRawTarget sma allLocs (ri, rt) = ] -- Not present at all, add it from Hackage Nothing -> do - mrev <- getLatestHackageRevision name version + mrev <- getLatestHackageRevision YesRequireHackageIndex name version pure $ case mrev of Nothing -> deferToConstructPlan name Just (_rev, cfKey, treeKey) -> Right ResolveResult @@ -355,7 +355,7 @@ resolveRawTarget sma allLocs (ri, rt) = } hackageLatest name = do - mloc <- getLatestHackageLocation name UsePreferredVersions + mloc <- getLatestHackageLocation YesRequireHackageIndex name UsePreferredVersions pure $ case mloc of Nothing -> deferToConstructPlan name Just loc -> do @@ -368,7 +368,7 @@ resolveRawTarget sma allLocs (ri, rt) = } hackageLatestRevision name version = do - mrev <- getLatestHackageRevision name version + mrev <- getLatestHackageRevision YesRequireHackageIndex name version pure $ case mrev of Nothing -> deferToConstructPlan name Just (_rev, cfKey, treeKey) -> Right ResolveResult diff --git a/src/Stack/BuildPlan.hs b/src/Stack/BuildPlan.hs index 0dd5cc7d8c..3412ffeb17 100644 --- a/src/Stack/BuildPlan.hs +++ b/src/Stack/BuildPlan.hs @@ -165,6 +165,7 @@ gpdPackageDeps gpd ac platform flags = , packageConfigEnableBenchmarks = True , packageConfigFlags = flags , packageConfigGhcOptions = [] + , packageConfigCabalConfigOpts = [] , packageConfigCompilerVersion = ac , packageConfigPlatform = platform } diff --git a/src/Stack/Config.hs b/src/Stack/Config.hs index 1b69107aec..539f5a5e45 100644 --- a/src/Stack/Config.hs +++ b/src/Stack/Config.hs @@ -62,6 +62,7 @@ import Stack.Config.Docker import Stack.Config.Nix import Stack.Constants import Stack.Build.Haddock (shouldHaddockDeps) +import Stack.Lock (lockCachedWanted) import Stack.Storage (initStorage) import Stack.SourceMap import Stack.Types.Build @@ -76,6 +77,7 @@ import System.Console.ANSI 
(hSupportsANSIWithoutEmulation) import System.Environment import System.PosixCompat.Files (fileOwner, getFileStatus) import System.PosixCompat.User (getEffectiveUserID) +import RIO.List (unzip) import RIO.PrettyPrint (stylesUpdateL, useColorL) import RIO.Process @@ -296,6 +298,7 @@ configFromConfigMonoid let configTemplateParams = configMonoidTemplateParameters configScmInit = getFirst configMonoidScmInit + configCabalConfigOpts = coerce configMonoidCabalConfigOpts configGhcOptionsByName = coerce configMonoidGhcOptionsByName configGhcOptionsByCat = coerce configMonoidGhcOptionsByCat configSetupInfoLocations = configMonoidSetupInfoLocations @@ -310,6 +313,7 @@ configFromConfigMonoid configSaveHackageCreds = fromFirst True configMonoidSaveHackageCreds configHackageBaseUrl = fromFirst "https://hackage.haskell.org/" configMonoidHackageBaseUrl configHideSourcePaths = fromFirstTrue configMonoidHideSourcePaths + configRecommendUpgrade = fromFirstTrue configMonoidRecommendUpgrade configAllowDifferentUser <- case getFirst configMonoidAllowDifferentUser of @@ -501,12 +505,51 @@ loadBuildConfig = do { projectCompiler = mcompiler <|> projectCompiler project' , projectResolver = fromMaybe (projectResolver project') mresolver } + extraPackageDBs <- mapM resolveDir' (projectExtraPackageDBs project) - resolver <- completeSnapshotLocation $ projectResolver project - (snapshot, _completed) <- loadAndCompleteSnapshot resolver + wanted <- lockCachedWanted stackYamlFP (projectResolver project) $ + fillProjectWanted stackYamlFP config project - extraPackageDBs <- mapM resolveDir' (projectExtraPackageDBs project) + return BuildConfig + { bcConfig = config + , bcSMWanted = wanted + , bcExtraPackageDBs = extraPackageDBs + , bcStackYaml = stackYamlFP + , bcCurator = projectCurator project + } + where + getEmptyProject :: Maybe RawSnapshotLocation -> [PackageIdentifierRevision] -> RIO Config Project + getEmptyProject mresolver extraDeps = do + r <- case mresolver of + Just resolver -> do + logInfo ("Using resolver: " <> display resolver <> " specified on command line") + return resolver + Nothing -> do + r'' <- getLatestResolver + logInfo ("Using latest snapshot resolver: " <> display r'') + return r'' + return Project + { projectUserMsg = Nothing + , projectPackages = [] + , projectDependencies = map (RPLImmutable . flip RPLIHackage Nothing) extraDeps + , projectFlags = mempty + , projectResolver = r + , projectCompiler = Nothing + , projectExtraPackageDBs = [] + , projectCurator = Nothing + , projectDropPackages = mempty + } +fillProjectWanted :: + (HasProcessContext env, HasLogFunc env, HasPantryConfig env) + => Path Abs t + -> Config + -> Project + -> Map RawPackageLocationImmutable PackageLocationImmutable + -> WantedCompiler + -> Map PackageName (Bool -> RIO env DepPackage) + -> RIO env (SMWanted, [CompletedPLI]) +fillProjectWanted stackYamlFP config project locCache snapCompiler snapPackages = do let bopts = configBuild config packages0 <- for (projectPackages project) $ \fp@(RelFilePath t) -> do @@ -515,25 +558,27 @@ loadBuildConfig = do pp <- mkProjectPackage YesPrintWarnings resolved (boptsHaddock bopts) pure (cpName $ ppCommon pp, pp) - let completeLocation (RPLMutable m) = pure $ PLMutable m - completeLocation (RPLImmutable im) = PLImmutable <$> completePackageLocation im - - deps0 <- forM (projectDependencies project) $ \rpl -> do - pl <- completeLocation rpl + (deps0, mcompleted) <- fmap unzip . 
forM (projectDependencies project) $ \rpl -> do + (pl, mCompleted) <- case rpl of + RPLImmutable rpli -> do + compl <- maybe (completePackageLocation rpli) pure (Map.lookup rpli locCache) + pure (PLImmutable compl, Just (rpli, compl)) + RPLMutable p -> + pure (PLMutable p, Nothing) dp <- additionalDepPackage (shouldHaddockDeps bopts) pl - pure (cpName $ dpCommon dp, dp) + pure ((cpName $ dpCommon dp, dp), mCompleted) checkDuplicateNames $ map (second (PLMutable . ppResolvedDir)) packages0 ++ map (second dpLocation) deps0 let packages1 = Map.fromList packages0 - snPackages = snapshotPackages snapshot + snPackages = snapPackages `Map.difference` packages1 `Map.difference` Map.fromList deps0 `Map.withoutKeys` projectDropPackages project - snDeps <- Map.traverseWithKey (snapToDepPackage (shouldHaddockDeps bopts)) snPackages + snDeps <- for snPackages $ \getDep -> getDep (shouldHaddockDeps bopts) let deps1 = Map.fromList deps0 `Map.union` snDeps @@ -559,41 +604,14 @@ loadBuildConfig = do throwM $ InvalidGhcOptionsSpecification (Map.keys unusedPkgGhcOptions) let wanted = SMWanted - { smwCompiler = fromMaybe (snapshotCompiler snapshot) (projectCompiler project) + { smwCompiler = fromMaybe snapCompiler (projectCompiler project) , smwProject = packages , smwDeps = deps , smwSnapshotLocation = projectResolver project } - return BuildConfig - { bcConfig = config - , bcSMWanted = wanted - , bcExtraPackageDBs = extraPackageDBs - , bcStackYaml = stackYamlFP - , bcCurator = projectCurator project - } - where - getEmptyProject :: Maybe RawSnapshotLocation -> [PackageIdentifierRevision] -> RIO Config Project - getEmptyProject mresolver extraDeps = do - r <- case mresolver of - Just resolver -> do - logInfo ("Using resolver: " <> display resolver <> " specified on command line") - return resolver - Nothing -> do - r'' <- getLatestResolver - logInfo ("Using latest snapshot resolver: " <> display r'') - return r'' - return Project - { projectUserMsg = Nothing - , projectPackages = [] - , projectDependencies = map (RPLImmutable . flip RPLIHackage Nothing) extraDeps - , projectFlags = mempty - , projectResolver = r - , projectCompiler = Nothing - , projectExtraPackageDBs = [] - , projectCurator = Nothing - , projectDropPackages = mempty - } + pure (wanted, catMaybes mcompleted) + -- | Check if there are any duplicate package names and, if so, throw an -- exception. 
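The precedence applied to snapshot packages in `fillProjectWanted` above can be illustrated with a small self-contained sketch (the helper name and toy `String`-keyed maps are hypothetical; it only mirrors the `Map.difference`/`Map.withoutKeys` chain used in the diff): project packages and explicit deps shadow snapshot entries, and anything in `drop-packages` is removed outright.

```haskell
import qualified Data.Map.Strict as Map
import qualified Data.Set as Set

-- Toy illustration of the merge precedence: snapshot entries lose to project
-- packages and to explicit extra-deps, and drop-packages removes entries
-- entirely before the remaining snapshot packages are used.
effectiveSnapshotPkgs
  :: Map.Map String v  -- ^ packages coming from the snapshot
  -> Map.Map String v  -- ^ project packages
  -> Map.Map String v  -- ^ explicit extra-deps
  -> Set.Set String    -- ^ drop-packages
  -> Map.Map String v
effectiveSnapshotPkgs snapPkgs projectPkgs deps dropped =
  snapPkgs
    `Map.difference` projectPkgs
    `Map.difference` deps
    `Map.withoutKeys` dropped
```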
diff --git a/src/Stack/Constants.hs b/src/Stack/Constants.hs index ceec2f9186..89da9f9a2c 100644 --- a/src/Stack/Constants.hs +++ b/src/Stack/Constants.hs @@ -120,6 +120,7 @@ module Stack.Constants ,hadrianCmdPosix ,usrLibDirs ,testGhcEnvRelFile + ,relFileBuildLock ) where @@ -599,3 +600,7 @@ usrLibDirs = [$(mkAbsDir "/usr/lib"),$(mkAbsDir "/usr/lib64")] -- | Relative file path for a temporary GHC environment file for tests testGhcEnvRelFile :: Path Rel File testGhcEnvRelFile = $(mkRelFile "test-ghc-env") + +-- | File inside a dist directory to use for locking +relFileBuildLock :: Path Rel File +relFileBuildLock = $(mkRelFile "build-lock") diff --git a/src/Stack/Dot.hs b/src/Stack/Dot.hs index 1234dcc9ed..0278e47daf 100644 --- a/src/Stack/Dot.hs +++ b/src/Stack/Dot.hs @@ -127,12 +127,12 @@ createDependencyGraph dotOpts = do let globalDumpMap = Map.fromList $ map (\dp -> (Stack.Prelude.pkgName (dpPackageIdent dp), dp)) globalDump globalIdMap = Map.fromList $ map (\dp -> (dpGhcPkgId dp, dpPackageIdent dp)) globalDump let depLoader = createDepLoader sourceMap globalDumpMap globalIdMap loadPackageDeps - loadPackageDeps name version loc flags ghcOptions + loadPackageDeps name version loc flags ghcOptions cabalConfigOpts -- Skip packages that can't be loaded - see -- https://github.com/commercialhaskell/stack/issues/2967 | name `elem` [mkPackageName "rts", mkPackageName "ghc"] = return (Set.empty, DotPayload (Just version) (Just $ Right BSD3)) - | otherwise = fmap (packageAllDeps &&& makePayload) (loadPackage loc flags ghcOptions) + | otherwise = fmap (packageAllDeps &&& makePayload) (loadPackage loc flags ghcOptions cabalConfigOpts) resolveDependencies (dotDependencyDepth dotOpts) graph depLoader where makePayload pkg = DotPayload (Just $ packageVersion pkg) (Just $ packageLicense pkg) @@ -265,7 +265,7 @@ createDepLoader :: SourceMap -> Map PackageName DumpPackage -> Map GhcPkgId PackageIdentifier -> (PackageName -> Version -> PackageLocationImmutable -> - Map FlagName Bool -> [Text] -> RIO DotConfig (Set PackageName, DotPayload)) + Map FlagName Bool -> [Text] -> [Text] -> RIO DotConfig (Set PackageName, DotPayload)) -> PackageName -> RIO DotConfig (Set PackageName, DotPayload) createDepLoader sourceMap globalDumpMap globalIdMap loadPackageDeps pkgName = do @@ -293,7 +293,8 @@ createDepLoader sourceMap globalDumpMap globalIdMap loadPackageDeps pkgName = do let PackageIdentifier name version = PD.package $ PD.packageDescription gpd flags = cpFlags common ghcOptions = cpGhcOptions common - assert (pkgName == name) (loadPackageDeps pkgName version loc flags ghcOptions) + cabalConfigOpts = cpCabalConfigOpts common + assert (pkgName == name) (loadPackageDeps pkgName version loc flags ghcOptions cabalConfigOpts) -- If package is a global package, use info from ghc-pkg (#4324, #3084) globalDeps = diff --git a/src/Stack/Freeze.hs b/src/Stack/Freeze.hs index c751b3145d..77fa033a0c 100644 --- a/src/Stack/Freeze.hs +++ b/src/Stack/Freeze.hs @@ -69,7 +69,7 @@ doFreeze p FreezeSnapshot = do case result of Left _wc -> logInfo "No freezing is required for compiler resolver" - Right (snap, _) -> do + Right snap -> do snap' <- completeSnapshotLayer snap let rawCompleted = toRawSnapshotLayer snap' if rawCompleted == snap diff --git a/src/Stack/Ghci.hs b/src/Stack/Ghci.hs index 40eed3f713..41b998137e 100644 --- a/src/Stack/Ghci.hs +++ b/src/Stack/Ghci.hs @@ -627,12 +627,17 @@ loadGhciPkgDesc buildOptsCLI name cabalfp target = do (cpGhcOptions . ppCommon <$> M.lookup name smProject) <|> (cpGhcOptions . 
dpCommon <$> M.lookup name smDeps) + sourceMapCabalConfigOpts = fromMaybe [] $ + (cpCabalConfigOpts . ppCommon <$> M.lookup name smProject) + <|> + (cpCabalConfigOpts . dpCommon <$> M.lookup name smDeps) config = PackageConfig { packageConfigEnableTests = True , packageConfigEnableBenchmarks = True , packageConfigFlags = getLocalFlags buildOptsCLI name , packageConfigGhcOptions = sourceMapGhcOptions + , packageConfigCabalConfigOpts = sourceMapCabalConfigOpts , packageConfigCompilerVersion = compilerVersion , packageConfigPlatform = view platformL econfig } diff --git a/src/Stack/Hoogle.hs b/src/Stack/Hoogle.hs index 712baf58aa..7a5215cad6 100644 --- a/src/Stack/Hoogle.hs +++ b/src/Stack/Hoogle.hs @@ -78,7 +78,7 @@ hoogleCmd (args,setup,rebuild,startServer) = installHoogle :: RIO EnvConfig () installHoogle = do hooglePackageIdentifier <- do - mversion <- getLatestHackageVersion hooglePackageName UsePreferredVersions + mversion <- getLatestHackageVersion YesRequireHackageIndex hooglePackageName UsePreferredVersions -- FIXME For a while, we've been following the logic of -- taking the latest Hoogle version available. However, we diff --git a/src/Stack/Lock.hs b/src/Stack/Lock.hs new file mode 100644 index 0000000000..bb9709a82d --- /dev/null +++ b/src/Stack/Lock.hs @@ -0,0 +1,125 @@ +{-# LANGUAGE FlexibleContexts #-} +{-# LANGUAGE FlexibleInstances #-} +{-# LANGUAGE NoImplicitPrelude #-} +{-# LANGUAGE OverloadedStrings #-} +{-# LANGUAGE RecordWildCards #-} + +module Stack.Lock + ( lockCachedWanted + , LockedLocation(..) + , Locked(..) + ) where + +import Data.Aeson.Extended +import qualified Data.List.NonEmpty as NE +import Data.Map (Map) +import qualified Data.Map as Map +import qualified Data.Yaml as Yaml +import Pantry +import Path (addFileExtension, parent) +import Path.IO (doesFileExist) +import RIO.Process +import Stack.Prelude +import Stack.SourceMap +import Stack.Types.Config +import Stack.Types.SourceMap + +data LockedLocation a b = LockedLocation + { llOriginal :: a + , llCompleted :: b + } deriving (Eq, Show) + +instance (ToJSON a, ToJSON b) => ToJSON (LockedLocation a b) where + toJSON ll = + object [ "original" .= llOriginal ll, "completed" .= llCompleted ll ] + +instance ( FromJSON (WithJSONWarnings (Unresolved a)) + , FromJSON (WithJSONWarnings (Unresolved b)) + ) => + FromJSON (WithJSONWarnings (Unresolved (LockedLocation a b))) where + parseJSON = + withObjectWarnings "LockedLocation" $ \o -> do + original <- jsonSubWarnings $ o ..: "original" + completed <- jsonSubWarnings $ o ..: "completed" + pure $ LockedLocation <$> original <*> completed + +-- Special wrapper extracting only 1 RawPackageLocationImmutable +-- serialization should not produce locations with multiple subdirs +-- so we should be OK using just a head element +newtype SingleRPLI = SingleRPLI { unSingleRPLI :: RawPackageLocationImmutable} + +instance FromJSON (WithJSONWarnings (Unresolved SingleRPLI)) where + parseJSON v = + do + WithJSONWarnings unresolvedRPLIs ws <- parseJSON v + let withWarnings x = WithJSONWarnings x ws + pure $ withWarnings $ SingleRPLI . 
NE.head <$> unresolvedRPLIs + +data Locked = Locked + { lckSnapshotLocaitons :: [LockedLocation RawSnapshotLocation SnapshotLocation] + , lckPkgImmutableLocations :: [LockedLocation RawPackageLocationImmutable PackageLocationImmutable] + } deriving (Eq, Show) + +instance ToJSON Locked where + toJSON Locked {..} = + object + [ "snapshots" .= lckSnapshotLocaitons + , "packages" .= lckPkgImmutableLocations + ] + +instance FromJSON (WithJSONWarnings (Unresolved Locked)) where + parseJSON = withObjectWarnings "Locked" $ \o -> do + snapshots <- jsonSubWarningsT $ o ..: "snapshots" + packages <- jsonSubWarningsT $ o ..: "packages" + let unwrap ll = ll { llOriginal = unSingleRPLI (llOriginal ll) } + pure $ Locked <$> sequenceA snapshots <*> (map unwrap <$> sequenceA packages) + +loadYamlThrow + :: HasLogFunc env + => (Value -> Yaml.Parser (WithJSONWarnings a)) -> Path Abs File -> RIO env a +loadYamlThrow parser path = do + val <- liftIO $ Yaml.decodeFileThrow (toFilePath path) + case Yaml.parseEither parser val of + Left err -> throwIO $ Yaml.AesonException err + Right (WithJSONWarnings res warnings) -> do + logJSONWarnings (toFilePath path) warnings + return res + +lockCachedWanted :: + (HasPantryConfig env, HasProcessContext env, HasLogFunc env) + => Path Abs File + -> RawSnapshotLocation + -> (Map RawPackageLocationImmutable PackageLocationImmutable + -> WantedCompiler + -> Map PackageName (Bool -> RIO env DepPackage) + -> RIO env ( SMWanted, [CompletedPLI])) + -> RIO env SMWanted +lockCachedWanted stackFile resolver fillWanted = do + lockFile <- liftIO $ addFileExtension "lock" stackFile + lockExists <- doesFileExist lockFile + locked <- + if not lockExists + then do + logDebug "Lock file doesn't exist" + pure $ Locked [] [] + else do + logDebug "Using package location completions from a lock file" + unresolvedLocked <- loadYamlThrow parseJSON lockFile + resolvePaths (Just $ parent stackFile) unresolvedLocked + let toMap :: Ord a => [LockedLocation a b] -> Map a b + toMap = Map.fromList . map (\ll -> (llOriginal ll, llCompleted ll)) + slocCache = toMap $ lckSnapshotLocaitons locked + pkgLocCache = toMap $ lckPkgImmutableLocations locked + (snap, slocCompleted, pliCompleted) <- + loadAndCompleteSnapshotRaw resolver slocCache pkgLocCache + let compiler = snapshotCompiler snap + snPkgs = Map.mapWithKey (\n p h -> snapToDepPackage h n p) (snapshotPackages snap) + (wanted, prjCompleted) <- fillWanted Map.empty compiler snPkgs + let lockLocations = map (uncurry LockedLocation) + newLocked = Locked { lckSnapshotLocaitons = lockLocations slocCompleted + , lckPkgImmutableLocations = + lockLocations $ pliCompleted <> prjCompleted + } + when (newLocked /= locked) $ + liftIO $ Yaml.encodeFile (toFilePath lockFile) newLocked + pure wanted diff --git a/src/Stack/ModuleInterface.hs b/src/Stack/ModuleInterface.hs new file mode 100644 index 0000000000..947c641330 --- /dev/null +++ b/src/Stack/ModuleInterface.hs @@ -0,0 +1,435 @@ +{-# LANGUAGE DerivingStrategies #-} +{-# LANGUAGE GeneralizedNewtypeDeriving #-} + +module Stack.ModuleInterface + ( Interface(..) + , List(..) + , Dictionary(..) + , Module(..) + , Usage(..) + , Dependencies(..) 
+ , getInterface + , fromFile + ) where + +{- HLINT ignore "Reduce duplication" -} + +import Control.Monad (replicateM, replicateM_) +import Data.Binary (Get, Word32) +import Data.Binary.Get (Decoder (..), bytesRead, + getByteString, getInt64be, + getWord32be, getWord64be, + getWord8, lookAhead, + runGetIncremental, skip) +import Data.Bool (bool) +import Data.ByteString.Lazy.Internal (defaultChunkSize) +import Data.Char (chr) +import Data.Functor (void, ($>)) +import Data.List (find) +import Data.Maybe (catMaybes) +import Data.Semigroup ((<>)) +import qualified Data.Vector as V +import GHC.IO.IOMode (IOMode (..)) +import Numeric (showHex) +import RIO.ByteString as B (ByteString, hGetSome, null) +import System.IO (withBinaryFile) + +type IsBoot = Bool + +type ModuleName = ByteString + +newtype List a = List + { unList :: [a] + } deriving newtype (Show) + +newtype Dictionary = Dictionary + { unDictionary :: V.Vector ByteString + } deriving newtype (Show) + +newtype Module = Module + { unModule :: ModuleName + } deriving newtype (Show) + +newtype Usage = Usage + { unUsage :: FilePath + } deriving newtype (Show) + +data Dependencies = Dependencies + { dmods :: List (ModuleName, IsBoot) + , dpkgs :: List (ModuleName, Bool) + , dorphs :: List Module + , dfinsts :: List Module + , dplugins :: List ModuleName + } deriving (Show) + +data Interface = Interface + { deps :: Dependencies + , usage :: List Usage + } deriving (Show) + +-- | Read a block prefixed with its length +withBlockPrefix :: Get a -> Get a +withBlockPrefix f = getWord32be *> f + +getBool :: Get Bool +getBool = toEnum . fromIntegral <$> getWord8 + +getString :: Get String +getString = fmap (chr . fromIntegral) . unList <$> getList getWord32be + +getMaybe :: Get a -> Get (Maybe a) +getMaybe f = bool (pure Nothing) (Just <$> f) =<< getBool + +getList :: Get a -> Get (List a) +getList f = do + i <- getWord8 + l <- + if i == 0xff + then getWord32be + else pure (fromIntegral i :: Word32) + List <$> replicateM (fromIntegral l) f + +getTuple :: Get a -> Get b -> Get (a, b) +getTuple f g = (,) <$> f <*> g + +getByteStringSized :: Get ByteString +getByteStringSized = do + size <- getInt64be + getByteString (fromIntegral size) + +getDictionary :: Int -> Get Dictionary +getDictionary ptr = do + offset <- bytesRead + skip $ ptr - fromIntegral offset + size <- fromIntegral <$> getInt64be + Dictionary <$> V.replicateM size getByteStringSized + +getCachedBS :: Dictionary -> Get ByteString +getCachedBS d = go =<< getWord32be + where + go i = + case unDictionary d V.!? fromIntegral i of + Just bs -> pure bs + Nothing -> fail $ "Invalid dictionary index: " <> show i + +getFP :: Get () +getFP = void $ getWord64be *> getWord64be + +getInterface721 :: Dictionary -> Get Interface +getInterface721 d = do + void getModule + void getBool + replicateM_ 2 getFP + void getBool + void getBool + Interface <$> getDependencies <*> getUsage + where + getModule = getCachedBS d *> (Module <$> getCachedBS d) + getDependencies = + withBlockPrefix $ + Dependencies <$> getList (getTuple (getCachedBS d) getBool) <*> + getList (getTuple (getCachedBS d) getBool) <*> + getList getModule <*> + getList getModule <*> + pure (List []) + getUsage = withBlockPrefix $ List . catMaybes . 
unList <$> getList go + where + go :: Get (Maybe Usage) + go = do + usageType <- getWord8 + case usageType of + 0 -> getModule *> getFP *> getBool $> Nothing + 1 -> + getCachedBS d *> getFP *> getMaybe getFP *> + getList (getTuple (getWord8 *> getCachedBS d) getFP) *> + getBool $> Nothing + _ -> fail $ "Invalid usageType: " <> show usageType + +getInterface741 :: Dictionary -> Get Interface +getInterface741 d = do + void getModule + void getBool + replicateM_ 3 getFP + void getBool + void getBool + Interface <$> getDependencies <*> getUsage + where + getModule = getCachedBS d *> (Module <$> getCachedBS d) + getDependencies = + withBlockPrefix $ + Dependencies <$> getList (getTuple (getCachedBS d) getBool) <*> + getList (getTuple (getCachedBS d) getBool) <*> + getList getModule <*> + getList getModule <*> + pure (List []) + getUsage = withBlockPrefix $ List . catMaybes . unList <$> getList go + where + go :: Get (Maybe Usage) + go = do + usageType <- getWord8 + case usageType of + 0 -> getModule *> getFP *> getBool $> Nothing + 1 -> + getCachedBS d *> getFP *> getMaybe getFP *> + getList (getTuple (getWord8 *> getCachedBS d) getFP) *> + getBool $> Nothing + 2 -> Just . Usage <$> getString <* getWord64be <* getWord64be + _ -> fail $ "Invalid usageType: " <> show usageType + +getInterface761 :: Dictionary -> Get Interface +getInterface761 d = do + void getModule + void getBool + replicateM_ 3 getFP + void getBool + void getBool + Interface <$> getDependencies <*> getUsage + where + getModule = getCachedBS d *> (Module <$> getCachedBS d) + getDependencies = + withBlockPrefix $ + Dependencies <$> getList (getTuple (getCachedBS d) getBool) <*> + getList (getTuple (getCachedBS d) getBool) <*> + getList getModule <*> + getList getModule <*> + pure (List []) + getUsage = withBlockPrefix $ List . catMaybes . unList <$> getList go + where + go :: Get (Maybe Usage) + go = do + usageType <- getWord8 + case usageType of + 0 -> getModule *> getFP *> getBool $> Nothing + 1 -> + getCachedBS d *> getFP *> getMaybe getFP *> + getList (getTuple (getWord8 *> getCachedBS d) getFP) *> + getBool $> Nothing + 2 -> Just . Usage <$> getString <* getWord64be <* getWord64be + _ -> fail $ "Invalid usageType: " <> show usageType + +getInterface781 :: Dictionary -> Get Interface +getInterface781 d = do + void getModule + void getBool + replicateM_ 3 getFP + void getBool + void getBool + Interface <$> getDependencies <*> getUsage + where + getModule = getCachedBS d *> (Module <$> getCachedBS d) + getDependencies = + withBlockPrefix $ + Dependencies <$> getList (getTuple (getCachedBS d) getBool) <*> + getList (getTuple (getCachedBS d) getBool) <*> + getList getModule <*> + getList getModule <*> + pure (List []) + getUsage = withBlockPrefix $ List . catMaybes . unList <$> getList go + where + go :: Get (Maybe Usage) + go = do + usageType <- getWord8 + case usageType of + 0 -> getModule *> getFP *> getBool $> Nothing + 1 -> + getCachedBS d *> getFP *> getMaybe getFP *> + getList (getTuple (getWord8 *> getCachedBS d) getFP) *> + getBool $> Nothing + 2 -> Just . 
Usage <$> getString <* getFP + _ -> fail $ "Invalid usageType: " <> show usageType + +getInterface801 :: Dictionary -> Get Interface +getInterface801 d = do + void getModule + void getWord8 + replicateM_ 3 getFP + void getBool + void getBool + Interface <$> getDependencies <*> getUsage + where + getModule = getCachedBS d *> (Module <$> getCachedBS d) + getDependencies = + withBlockPrefix $ + Dependencies <$> getList (getTuple (getCachedBS d) getBool) <*> + getList (getTuple (getCachedBS d) getBool) <*> + getList getModule <*> + getList getModule <*> + pure (List []) + getUsage = withBlockPrefix $ List . catMaybes . unList <$> getList go + where + go :: Get (Maybe Usage) + go = do + usageType <- getWord8 + case usageType of + 0 -> getModule *> getFP *> getBool $> Nothing + 1 -> + getCachedBS d *> getFP *> getMaybe getFP *> + getList (getTuple (getWord8 *> getCachedBS d) getFP) *> + getBool $> Nothing + 2 -> Just . Usage <$> getString <* getFP + 3 -> getModule *> getFP $> Nothing + _ -> fail $ "Invalid usageType: " <> show usageType + +getInterface821 :: Dictionary -> Get Interface +getInterface821 d = do + void getModule + void $ getMaybe getModule + void getWord8 + replicateM_ 3 getFP + void getBool + void getBool + Interface <$> getDependencies <*> getUsage + where + getModule = do + idType <- getWord8 + case idType of + 0 -> void $ getCachedBS d + _ -> + void $ + getCachedBS d *> getList (getTuple (getCachedBS d) getModule) + Module <$> getCachedBS d + getDependencies = + withBlockPrefix $ + Dependencies <$> getList (getTuple (getCachedBS d) getBool) <*> + getList (getTuple (getCachedBS d) getBool) <*> + getList getModule <*> + getList getModule <*> + pure (List []) + getUsage = withBlockPrefix $ List . catMaybes . unList <$> getList go + where + go :: Get (Maybe Usage) + go = do + usageType <- getWord8 + case usageType of + 0 -> getModule *> getFP *> getBool $> Nothing + 1 -> + getCachedBS d *> getFP *> getMaybe getFP *> + getList (getTuple (getWord8 *> getCachedBS d) getFP) *> + getBool $> Nothing + 2 -> Just . Usage <$> getString <* getFP + 3 -> getModule *> getFP $> Nothing + _ -> fail $ "Invalid usageType: " <> show usageType + +getInterface841 :: Dictionary -> Get Interface +getInterface841 d = do + void getModule + void $ getMaybe getModule + void getWord8 + replicateM_ 5 getFP + void getBool + void getBool + Interface <$> getDependencies <*> getUsage + where + getModule = do + idType <- getWord8 + case idType of + 0 -> void $ getCachedBS d + _ -> + void $ + getCachedBS d *> getList (getTuple (getCachedBS d) getModule) + Module <$> getCachedBS d + getDependencies = + withBlockPrefix $ + Dependencies <$> getList (getTuple (getCachedBS d) getBool) <*> + getList (getTuple (getCachedBS d) getBool) <*> + getList getModule <*> + getList getModule <*> + pure (List []) + getUsage = withBlockPrefix $ List . catMaybes . unList <$> getList go + where + go :: Get (Maybe Usage) + go = do + usageType <- getWord8 + case usageType of + 0 -> getModule *> getFP *> getBool $> Nothing + 1 -> + getCachedBS d *> getFP *> getMaybe getFP *> + getList (getTuple (getWord8 *> getCachedBS d) getFP) *> + getBool $> Nothing + 2 -> Just . 
Usage <$> getString <* getFP + 3 -> getModule *> getFP $> Nothing + _ -> fail $ "Invalid usageType: " <> show usageType + +getInterface861 :: Dictionary -> Get Interface +getInterface861 d = do + void getModule + void $ getMaybe getModule + void getWord8 + replicateM_ 6 getFP + void getBool + void getBool + Interface <$> getDependencies <*> getUsage + where + getModule = do + idType <- getWord8 + case idType of + 0 -> void $ getCachedBS d + _ -> + void $ + getCachedBS d *> getList (getTuple (getCachedBS d) getModule) + Module <$> getCachedBS d + getDependencies = + withBlockPrefix $ + Dependencies <$> getList (getTuple (getCachedBS d) getBool) <*> + getList (getTuple (getCachedBS d) getBool) <*> + getList getModule <*> + getList getModule <*> + getList (getCachedBS d) + getUsage = withBlockPrefix $ List . catMaybes . unList <$> getList go + where + go :: Get (Maybe Usage) + go = do + usageType <- getWord8 + case usageType of + 0 -> getModule *> getFP *> getBool $> Nothing + 1 -> + getCachedBS d *> getFP *> getMaybe getFP *> + getList (getTuple (getWord8 *> getCachedBS d) getFP) *> + getBool $> Nothing + 2 -> Just . Usage <$> getString <* getFP + 3 -> getModule *> getFP $> Nothing + _ -> fail $ "Invalid usageType: " <> show usageType + +getInterface :: Get Interface +getInterface = do + magic <- getWord32be + case magic of + -- x32 + 0x1face -> void getWord32be + -- x64 + 0x1face64 -> void getWord64be + invalidMagic -> fail $ "Invalid magic: " <> showHex invalidMagic "" + -- ghc version + version <- getString + -- way + void getString + -- dict_ptr + dictPtr <- getWord32be + -- dict + dict <- lookAhead $ getDictionary $ fromIntegral dictPtr + -- symtable_ptr + void getWord32be + let versions = + [ ("8061", getInterface861) + , ("8041", getInterface841) + , ("8021", getInterface821) + , ("8001", getInterface801) + , ("7081", getInterface781) + , ("7061", getInterface761) + , ("7041", getInterface741) + , ("7021", getInterface721) + ] + case snd <$> find ((version >=) . fst) versions of + Just f -> f dict + Nothing -> fail $ "Unsupported version: " <> version + +fromFile :: FilePath -> IO (Either String Interface) +fromFile fp = withBinaryFile fp ReadMode go + where + go h = + let feed (Done _ _ iface) = pure $ Right iface + feed (Fail _ _ msg) = pure $ Left msg + feed (Partial k) = do + chunk <- hGetSome h defaultChunkSize + feed $ k $ if B.null chunk then Nothing else Just chunk + in feed $ runGetIncremental getInterface diff --git a/src/Stack/Package.hs b/src/Stack/Package.hs index 81816594f1..82cefa3454 100644 --- a/src/Stack/Package.hs +++ b/src/Stack/Package.hs @@ -1,13 +1,13 @@ -{-# LANGUAGE NoImplicitPrelude #-} -{-# LANGUAGE TupleSections #-} +{-# LANGUAGE DataKinds #-} +{-# LANGUAGE DeriveDataTypeable #-} +{-# LANGUAGE FlexibleContexts #-} +{-# LANGUAGE FlexibleInstances #-} +{-# LANGUAGE NoImplicitPrelude #-} +{-# LANGUAGE OverloadedStrings #-} +{-# LANGUAGE RankNTypes #-} +{-# LANGUAGE RecordWildCards #-} {-# LANGUAGE ScopedTypeVariables #-} -{-# LANGUAGE FlexibleContexts #-} -{-# LANGUAGE DataKinds #-} -{-# LANGUAGE DeriveDataTypeable #-} -{-# LANGUAGE FlexibleInstances #-} -{-# LANGUAGE OverloadedStrings #-} -{-# LANGUAGE RankNTypes #-} -{-# LANGUAGE RecordWildCards #-} +{-# LANGUAGE TupleSections #-} -- | Dealing with Cabal. 
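As a usage sketch for the new `Stack.ModuleInterface` reader introduced above (the wrapper function and file path are hypothetical, and error handling is deliberately minimal), the exported `fromFile` can be used to inspect the dependencies and file usages recorded in a GHC `.hi` interface file:

```haskell
import qualified Stack.ModuleInterface as Iface

-- Print the module dependencies and file usages recorded in a .hi file.
dumpInterface :: FilePath -> IO ()
dumpInterface hiPath = do
  result <- Iface.fromFile hiPath
  case result of
    Left err -> putStrLn ("Failed to parse interface: " ++ err)
    Right iface -> do
      -- Names of the modules this module depends on
      mapM_ (print . fst) (Iface.unList (Iface.dmods (Iface.deps iface)))
      -- File usages recorded in the interface (e.g. Template Haskell
      -- addDependentFile entries)
      mapM_ (putStrLn . Iface.unUsage) (Iface.unList (Iface.usage iface))
```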
@@ -27,56 +27,62 @@ module Stack.Package ,applyForceCustomBuild ) where -import qualified Data.ByteString.Lazy.Char8 as CL8 -import Data.List (isPrefixOf, unzip) -import Data.Maybe (maybe) -import qualified Data.Map.Strict as M -import qualified Data.Set as S -import qualified Data.Text as T -import qualified Data.Text.Lazy as TL -import qualified Data.Text.Lazy.Encoding as TLE +import qualified Data.ByteString.Char8 as B8 +import Data.List (find, isPrefixOf, + unzip) +import qualified Data.Map.Strict as M +import Data.Maybe (fromMaybe, maybe) +import qualified Data.Set as S +import qualified Data.Text as T import Distribution.Compiler -import Distribution.ModuleName (ModuleName) -import qualified Distribution.ModuleName as Cabal -import qualified Distribution.Package as D -import Distribution.Package hiding (Package,PackageName,packageName,packageVersion,PackageIdentifier) -import qualified Distribution.PackageDescription as D -import Distribution.PackageDescription hiding (FlagName) +import Distribution.ModuleName (ModuleName) +import qualified Distribution.ModuleName as Cabal +import Distribution.Package hiding (Package, + PackageIdentifier, + PackageName, + packageName, + packageVersion) +import qualified Distribution.Package as D +import Distribution.PackageDescription hiding (FlagName) +import qualified Distribution.PackageDescription as D import Distribution.PackageDescription.Parsec -import Distribution.Simple.Glob (matchDirFileGlob) -import Distribution.System (OS (..), Arch, Platform (..)) -import qualified Distribution.Text as D -import qualified Distribution.Types.CondTree as Cabal -import qualified Distribution.Types.ExeDependency as Cabal +import Distribution.Simple.Glob (matchDirFileGlob) +import Distribution.System (Arch, OS (..), + Platform (..)) +import qualified Distribution.Text as D +import qualified Distribution.Types.CondTree as Cabal +import qualified Distribution.Types.ExeDependency as Cabal import Distribution.Types.ForeignLib import qualified Distribution.Types.LegacyExeDependency as Cabal import Distribution.Types.MungedPackageName import qualified Distribution.Types.UnqualComponentName as Cabal -import qualified Distribution.Verbosity as D -import Distribution.Version (mkVersion, orLaterVersion, anyVersion) -import Path as FL +import qualified Distribution.Verbosity as D +import Distribution.Version (anyVersion, mkVersion, + orLaterVersion) +import Path as FL import Path.Extra -import Path.IO hiding (findFiles) +import Path.IO hiding (findFiles) +import RIO.PrettyPrint +import qualified RIO.PrettyPrint as PP (Style (Module)) +import RIO.Process import Stack.Build.Installed import Stack.Constants import Stack.Constants.Config -import Stack.Prelude hiding (Display (..)) +import qualified Stack.ModuleInterface as Iface +import Stack.Prelude hiding (Display (..)) import Stack.Types.Compiler import Stack.Types.Config import Stack.Types.GhcPkgId import Stack.Types.NamedComponent import Stack.Types.Package import Stack.Types.Version -import qualified System.Directory as D -import System.FilePath (replaceExtension) -import qualified System.FilePath as FilePath +import qualified System.Directory as D +import System.FilePath (replaceExtension) +import qualified System.FilePath as FilePath import System.IO.Error -import RIO.Process -import RIO.PrettyPrint -import qualified RIO.PrettyPrint as PP (Style (Module)) -data Ctx = Ctx { ctxFile :: !(Path Abs File) - , ctxDistDir :: !(Path Abs Dir) +data Ctx = Ctx { ctxFile :: !(Path Abs File) + , ctxDistDir :: !(Path Abs Dir) , 
ctxBuildConfig :: !BuildConfig } @@ -135,6 +141,7 @@ packageFromPackageDescription packageConfig pkgFlags (PackageDescriptionPair pkg , packageFiles = pkgFiles , packageUnknownTools = unknownTools , packageGhcOptions = packageConfigGhcOptions packageConfig + , packageCabalConfigOpts = packageConfigCabalConfigOpts packageConfig , packageFlags = packageConfigFlags packageConfig , packageDefaultFlags = M.fromList [(flagName flag, flagDefault flag) | flag <- pkgFlags] @@ -147,7 +154,7 @@ packageFromPackageDescription packageConfig pkgFlags (PackageDescriptionPair pkg in case mlib of Nothing -> NoLibraries - Just _ -> HasLibraries foreignLibNames + Just _ -> HasLibraries foreignLibNames , packageInternalLibraries = subLibNames , packageTests = M.fromList [(T.pack (Cabal.unUnqualComponentName $ testName t), testInterface t) @@ -331,18 +338,18 @@ generatePkgDescOpts installMap installedMap omitPkgs addPkgs cabalfp pkg compone -- | Input to 'generateBuildInfoOpts' data BioInput = BioInput - { biInstallMap :: !InstallMap - , biInstalledMap :: !InstalledMap - , biCabalDir :: !(Path Abs Dir) - , biDistDir :: !(Path Abs Dir) - , biOmitPackages :: ![PackageName] - , biAddPackages :: ![PackageName] - , biBuildInfo :: !BuildInfo - , biDotCabalPaths :: ![DotCabalPath] - , biConfigLibDirs :: ![FilePath] + { biInstallMap :: !InstallMap + , biInstalledMap :: !InstalledMap + , biCabalDir :: !(Path Abs Dir) + , biDistDir :: !(Path Abs Dir) + , biOmitPackages :: ![PackageName] + , biAddPackages :: ![PackageName] + , biBuildInfo :: !BuildInfo + , biDotCabalPaths :: ![DotCabalPath] + , biConfigLibDirs :: ![FilePath] , biConfigIncludeDirs :: ![FilePath] - , biComponentName :: !NamedComponent - , biCabalVersion :: !Version + , biComponentName :: !NamedComponent + , biCabalVersion :: !Version } -- | Generate GHC options for the target. Since Cabal also figures out @@ -389,7 +396,7 @@ generateBuildInfoOpts BioInput {..} = ghcOpts = concatMap snd . filter (isGhc . fst) $ options biBuildInfo where isGhc GHC = True - isGhc _ = False + isGhc _ = False extOpts = map (("-X" ++) . D.display) (usedExtensions biBuildInfo) srcOpts = map (("-i" <>) . toFilePathNoTrailingSep) @@ -405,7 +412,7 @@ generateBuildInfoOpts BioInput {..} = ]) ++ [ "-stubdir=" ++ toFilePathNoTrailingSep (buildDir biDistDir) ] componentAutogen = componentAutogenDir biCabalVersion biComponentName biDistDir - toIncludeDir "." = Just biCabalDir + toIncludeDir "." = Just biCabalDir toIncludeDir relDir = concatAndColapseAbsDir biCabalDir relDir includeOpts = map ("-I" <>) (biConfigIncludeDirs <> pkgIncludeOpts) @@ -484,21 +491,21 @@ componentBuildDir cabalVer component distDir | cabalVer < mkVersion [2, 0] = buildDir distDir | otherwise = case component of - CLib -> buildDir distDir + CLib -> buildDir distDir CInternalLib name -> buildDir distDir componentNameToDir name - CExe name -> buildDir distDir componentNameToDir name - CTest name -> buildDir distDir componentNameToDir name - CBench name -> buildDir distDir componentNameToDir name + CExe name -> buildDir distDir componentNameToDir name + CTest name -> buildDir distDir componentNameToDir name + CBench name -> buildDir distDir componentNameToDir name -- | The directory where generated files are put like .o or .hs (from .x files). 
componentOutputDir :: NamedComponent -> Path Abs Dir -> Path Abs Dir componentOutputDir namedComponent distDir = case namedComponent of - CLib -> buildDir distDir + CLib -> buildDir distDir CInternalLib name -> makeTmp name - CExe name -> makeTmp name - CTest name -> makeTmp name - CBench name -> makeTmp name + CExe name -> makeTmp name + CTest name -> makeTmp name + CBench name -> makeTmp name where makeTmp name = buildDir distDir componentNameToDir (name <> "/" <> name <> "-tmp") @@ -736,7 +743,7 @@ benchmarkFiles component bench = do names = bnames <> exposed exposed = case benchmarkInterface bench of - BenchmarkExeV10 _ fp -> [DotCabalMain fp] + BenchmarkExeV10 _ fp -> [DotCabalMain fp] BenchmarkUnsupported _ -> [] bnames = map DotCabalModule (otherModules build) build = benchmarkBuildInfo bench @@ -752,8 +759,8 @@ testFiles component test = do names = bnames <> exposed exposed = case testInterface test of - TestSuiteExeV10 _ fp -> [DotCabalMain fp] - TestSuiteLibV09 _ mn -> [DotCabalModule mn] + TestSuiteExeV10 _ fp -> [DotCabalMain fp] + TestSuiteLibV09 _ mn -> [DotCabalModule mn] TestSuiteUnsupported _ -> [] bnames = map DotCabalModule (otherModules build) build = testBuildInfo test @@ -841,7 +848,7 @@ targetJsSources = jsSources -- moment. Odds are, you're reading this in the year 2024 and thinking -- "wtf?" data PackageDescriptionPair = PackageDescriptionPair - { pdpOrigBuildable :: PackageDescription + { pdpOrigBuildable :: PackageDescription , pdpModifiedBuildable :: PackageDescription } @@ -929,10 +936,10 @@ flagMap = M.fromList . map pair pair = flagName &&& flagDefault data ResolveConditions = ResolveConditions - { rcFlags :: Map FlagName Bool + { rcFlags :: Map FlagName Bool , rcCompilerVersion :: ActualCompiler - , rcOS :: OS - , rcArch :: Arch + , rcOS :: OS + , rcArch :: Arch } -- | Generic a @ResolveConditions@ using sensible defaults. @@ -1015,7 +1022,7 @@ resolveFilesAndDeps component dirs names0 = do let foundFiles = mapMaybe snd resolved foundModules = mapMaybe toResolvedModule resolved missingModules = mapMaybe toMissingModule resolved - pairs <- mapM (getDependencies component) foundFiles + pairs <- mapM (getDependencies component dirs) foundFiles let doneModules = S.union doneModules0 @@ -1076,81 +1083,61 @@ resolveFilesAndDeps component dirs names0 = do -- | Get the dependencies of a Haskell module file. getDependencies - :: NamedComponent -> DotCabalPath -> RIO Ctx (Set ModuleName, [Path Abs File]) -getDependencies component dotCabalPath = + :: NamedComponent -> [Path Abs Dir] -> DotCabalPath -> RIO Ctx (Set ModuleName, [Path Abs File]) +getDependencies component dirs dotCabalPath = case dotCabalPath of DotCabalModulePath resolvedFile -> readResolvedHi resolvedFile - DotCabalMainPath resolvedFile -> readResolvedHi resolvedFile - DotCabalFilePath{} -> return (S.empty, []) - DotCabalCFilePath{} -> return (S.empty, []) + DotCabalMainPath resolvedFile -> readResolvedHi resolvedFile + DotCabalFilePath{} -> return (S.empty, []) + DotCabalCFilePath{} -> return (S.empty, []) where readResolvedHi resolvedFile = do dumpHIDir <- componentOutputDir component <$> asks ctxDistDir dir <- asks (parent . 
ctxFile) - case stripProperPrefix dir resolvedFile of + let sourceDir = fromMaybe dir $ find (`isProperPrefixOf` resolvedFile) dirs + stripSourceDir d = stripProperPrefix d resolvedFile + case stripSourceDir sourceDir of Nothing -> return (S.empty, []) Just fileRel -> do - let dumpHIPath = + let hiPath = FilePath.replaceExtension (toFilePath (dumpHIDir fileRel)) - ".dump-hi" - dumpHIExists <- liftIO $ D.doesFileExist dumpHIPath + ".hi" + dumpHIExists <- liftIO $ D.doesFileExist hiPath if dumpHIExists - then parseDumpHI dumpHIPath + then parseHI hiPath else return (S.empty, []) --- | Parse a .dump-hi file into a set of modules and files. -parseDumpHI +-- | Parse a .hi file into a set of modules and files. +parseHI :: FilePath -> RIO Ctx (Set ModuleName, [Path Abs File]) -parseDumpHI dumpHIPath = do - dir <- asks (parent . ctxFile) - dumpHI <- liftIO $ filterDumpHi <$> fmap CL8.lines (CL8.readFile dumpHIPath) - let startModuleDeps = - dropWhile (not . ("module dependencies:" `CL8.isPrefixOf`)) dumpHI - moduleDeps = - S.fromList $ - mapMaybe (D.simpleParse . TL.unpack . TLE.decodeUtf8) $ - CL8.words $ - CL8.concat $ - CL8.dropWhile (/= ' ') (fromMaybe "" $ listToMaybe startModuleDeps) : - takeWhile (" " `CL8.isPrefixOf`) (drop 1 startModuleDeps) - thDeps = - -- The dependent file path is surrounded by quotes but is not escaped. - -- It can be an absolute or relative path. - TL.unpack . - -- Starting with GHC 8.4.3, there's a hash following - -- the path. See - -- https://github.com/yesodweb/yesod/issues/1551 - TLE.decodeUtf8 . - CL8.takeWhile (/= '\"') <$> - mapMaybe (CL8.stripPrefix "addDependentFile \"") dumpHI - thDepsResolved <- liftM catMaybes $ forM thDeps $ \x -> do - mresolved <- liftIO (forgivingAbsence (resolveFile dir x)) >>= rejectMissingFile - when (isNothing mresolved) $ - prettyWarnL - [ flow "addDependentFile path (Template Haskell) listed in" - , style File $ fromString dumpHIPath - , flow "does not exist:" - , style File $ fromString x - ] - return mresolved - return (moduleDeps, thDepsResolved) - where - -- | Filtering step fixing RAM usage upon a big dump-hi file. See - -- https://github.com/commercialhaskell/stack/issues/4027 It is - -- an optional step from a functionality stand-point. - filterDumpHi dumpHI = - let dl x xs = x ++ xs - isLineInteresting (acc, moduleDepsStarted) l - | moduleDepsStarted && " " `CL8.isPrefixOf` l = - (acc . dl [l], True) - | "module dependencies:" `CL8.isPrefixOf` l = - (acc . dl [l], True) - | "addDependentFile \"" `CL8.isPrefixOf` l = - (acc . dl [l], False) - | otherwise = (acc, False) - in fst (foldl' isLineInteresting (dl [], False) dumpHI) [] - +parseHI hiPath = do + dir <- asks (parent . ctxFile) + result <- liftIO $ Iface.fromFile hiPath + case result of + Left msg -> do + prettyWarnL + [ flow "Failed to decode module interface:" + , style File $ fromString hiPath + , flow "Decoding failure:" + , style Error $ fromString msg + ] + pure (S.empty, []) + Right iface -> do + let moduleNames = fmap (fromString . B8.unpack . fst) . Iface.unList . Iface.dmods . Iface.deps + resolveFileDependency file = do + resolved <- liftIO (forgivingAbsence (resolveFile dir file)) >>= rejectMissingFile + when (isNothing resolved) $ + prettyWarnL + [ flow "Dependent file listed in:" + , style File $ fromString hiPath + , flow "does not exist:" + , style File $ fromString file + ] + pure resolved + resolveUsages = traverse (resolveFileDependency . Iface.unUsage) . Iface.unList . 
Iface.usage + resolvedUsages <- catMaybes <$> resolveUsages iface + pure (S.fromList $ moduleNames iface, resolvedUsages) -- | Try to resolve the list of base names in the given directory by -- looking for unique instances of base names applied with the given @@ -1178,7 +1165,7 @@ parsePackageNameFromFilePath fp = do base <- clean $ toFilePath $ filename fp case parsePackageName base of Nothing -> throwM $ CabalFileNameInvalidPackageName $ toFilePath fp - Just x -> return x + Just x -> return x where clean = liftM reverse . strip . reverse strip ('l':'a':'b':'a':'c':'.':xs) = return xs strip _ = throwM (CabalFileNameParseFail (toFilePath fp)) @@ -1207,9 +1194,9 @@ findCandidate dirs name = do cons = case name of DotCabalModule{} -> DotCabalModulePath - DotCabalMain{} -> DotCabalMainPath - DotCabalFile{} -> DotCabalFilePath - DotCabalCFile{} -> DotCabalCFilePath + DotCabalMain{} -> DotCabalMainPath + DotCabalFile{} -> DotCabalFilePath + DotCabalCFile{} -> DotCabalCFilePath paths_pkg pkg = "Paths_" ++ packageNameString pkg makeNameCandidates = liftM (nubOrd . concat) (mapM makeDirCandidates dirs) @@ -1235,7 +1222,7 @@ findCandidate dirs name = do ([_], [y]) -> [y] -- Otherwise, return everything - (xs, ys) -> xs ++ ys + (xs, ys) -> xs ++ ys resolveCandidate dir = fmap maybeToList . resolveDirFile dir -- | Resolve file as a child of a specified directory, symlinks @@ -1263,9 +1250,9 @@ warnMultiple name candidate rest = , dispOne candidate ] where showName (DotCabalModule name') = D.display name' - showName (DotCabalMain fp) = fp - showName (DotCabalFile fp) = fp - showName (DotCabalCFile fp) = fp + showName (DotCabalMain fp) = fp + showName (DotCabalFile fp) = fp + showName (DotCabalCFile fp) = fp dispOne = fromString . toFilePath -- TODO: figure out why dispOne can't be just `display` -- (remove the .hlint.yaml exception if it can be) diff --git a/src/Stack/Runners.hs b/src/Stack/Runners.hs index bb16383948..9f2a0be259 100644 --- a/src/Stack/Runners.hs +++ b/src/Stack/Runners.hs @@ -21,6 +21,7 @@ module Stack.Runners import Stack.Prelude import RIO.Process (mkDefaultProcessContext) +import RIO.Time (addUTCTime, getCurrentTime) import Stack.Build.Target(NeedTargets(..)) import Stack.Config import Stack.Constants @@ -28,9 +29,11 @@ import Stack.DefaultColorWhen (defaultColorWhen) import qualified Stack.Docker as Docker import qualified Stack.Nix as Nix import Stack.Setup +import Stack.Storage (upgradeChecksSince, logUpgradeCheck) import Stack.Types.Config import Stack.Types.Docker (dockerEnable) import Stack.Types.Nix (nixEnable) +import Stack.Types.Version (stackMinorVersion, stackVersion, minorVersion) import System.Console.ANSI (hSupportsANSIWithoutEmulation) import System.Terminal (getTerminalWidth) @@ -94,7 +97,11 @@ withConfig shouldReexec inner = -- happen ASAP but needs a configuration. view (globalOptsL.to globalDockerEntrypoint) >>= traverse_ (Docker.entrypoint config) - runRIO config $ + runRIO config $ do + -- Catching all exceptions here, since we don't want this + -- check to ever cause Stack to stop working + shouldUpgradeCheck `catchAny` \e -> + logError ("Error when running shouldUpgradeCheck: " <> displayShow e) case shouldReexec of YesReexec -> reexec inner NoReexec -> inner @@ -169,3 +176,34 @@ withRunnerGlobal go inner = do | w < minTerminalWidth = minTerminalWidth | w > maxTerminalWidth = maxTerminalWidth | otherwise = w + +-- | Check if we should recommend upgrading Stack and, if so, recommend it. 
+shouldUpgradeCheck :: RIO Config () +shouldUpgradeCheck = do + config <- ask + when (configRecommendUpgrade config) $ do + now <- getCurrentTime + let yesterday = addUTCTime (-24 * 60 * 60) now + checks <- upgradeChecksSince yesterday + when (checks == 0) $ do + mversion <- getLatestHackageVersion NoRequireHackageIndex "stack" UsePreferredVersions + case mversion of + -- Compare the minor version so we avoid patch-level, Hackage-only releases. + -- See: https://github.com/commercialhaskell/stack/pull/4729#pullrequestreview-227176315 + Just (PackageIdentifierRevision _ version _) | minorVersion version > stackMinorVersion -> do + logWarn "<<<<<<<<<<<<<<<<<<" + logWarn $ + "You are currently using Stack version " <> + fromString (versionString stackVersion) <> + ", but version " <> + fromString (versionString version) <> + " is available" + logWarn "You can try to upgrade by running 'stack upgrade'" + logWarn $ + "Tired of seeing this? Add 'recommend-stack-upgrade: false' to " <> + fromString (toFilePath (configUserConfigPath config)) + logWarn ">>>>>>>>>>>>>>>>>>" + logWarn "" + logWarn "" + _ -> pure () + logUpgradeCheck now diff --git a/src/Stack/SDist.hs b/src/Stack/SDist.hs index 03cc46fc2f..4e2a994cfe 100644 --- a/src/Stack/SDist.hs +++ b/src/Stack/SDist.hs @@ -511,6 +511,7 @@ getDefaultPackageConfig = do , packageConfigEnableBenchmarks = False , packageConfigFlags = mempty , packageConfigGhcOptions = [] + , packageConfigCabalConfigOpts = [] , packageConfigCompilerVersion = compilerVersion , packageConfigPlatform = platform } diff --git a/src/Stack/SourceMap.hs b/src/Stack/SourceMap.hs index f6c1c67500..6e8070e219 100644 --- a/src/Stack/SourceMap.hs +++ b/src/Stack/SourceMap.hs @@ -56,6 +56,7 @@ mkProjectPackage printWarnings dir buildHaddocks = do , cpName = name , cpFlags = mempty , cpGhcOptions = mempty + , cpCabalConfigOpts = mempty , cpHaddocks = buildHaddocks } } @@ -86,6 +87,7 @@ additionalDepPackage buildHaddocks pl = do , cpName = name , cpFlags = mempty , cpGhcOptions = mempty + , cpCabalConfigOpts = mempty , cpHaddocks = buildHaddocks } } @@ -107,6 +109,7 @@ snapToDepPackage buildHaddocks name SnapshotPackage{..} = do , cpName = name , cpFlags = spFlags , cpGhcOptions = spGhcOptions + , cpCabalConfigOpts = [] -- No spCabalConfigOpts, not present in snapshots , cpHaddocks = buildHaddocks } } @@ -138,8 +141,7 @@ globalsFromHints :: => WantedCompiler -> RIO env (Map PackageName Version) globalsFromHints compiler = do - ghfp <- globalHintsFile - mglobalHints <- loadGlobalHints ghfp compiler + mglobalHints <- loadGlobalHints compiler case mglobalHints of Just hints -> pure hints Nothing -> do @@ -260,7 +262,7 @@ loadProjectSnapshotCandidate :: -> Bool -> RIO env (SnapshotCandidate env) loadProjectSnapshotCandidate loc printWarnings buildHaddocks = do - snapshot <- fmap fst . 
loadAndCompleteSnapshot =<< completeSnapshotLocation loc + (snapshot, _, _) <- loadAndCompleteSnapshotRaw loc Map.empty Map.empty deps <- Map.traverseWithKey (snapToDepPackage False) (snapshotPackages snapshot) let wc = snapshotCompiler snapshot globals <- Map.map GlobalPackageVersion <$> globalsFromHints wc diff --git a/src/Stack/Storage.hs b/src/Stack/Storage.hs index ac1a1da35c..eceb353e29 100644 --- a/src/Stack/Storage.hs +++ b/src/Stack/Storage.hs @@ -28,6 +28,8 @@ module Stack.Storage , saveDockerImageExeCache , loadCompilerPaths , saveCompilerPaths + , upgradeChecksSince + , logUpgradeCheck ) where import qualified Data.ByteString as S @@ -151,6 +153,12 @@ CompilerCache globalDump Text UniqueCompilerInfo ghcPath + +-- Last time certain actions were performed +LastPerformed + action Action + timestamp UTCTime + UniqueAction action |] -- | Initialize the database. @@ -544,3 +552,16 @@ saveCompilerPaths CompilerPaths {..} = withStorage $ do , compilerCacheGlobalDump = tshow cpGlobalDump , compilerCacheArch = T.pack $ Distribution.Text.display cpArch } + +-- | How many upgrade checks have occurred since the given timestamp? +upgradeChecksSince :: HasConfig env => UTCTime -> RIO env Int +upgradeChecksSince since = withStorage $ count + [ LastPerformedAction ==. UpgradeCheck + , LastPerformedTimestamp >=. since + ] + +-- | Log in the database that an upgrade check occurred at the given time. +logUpgradeCheck :: HasConfig env => UTCTime -> RIO env () +logUpgradeCheck time = withStorage $ void $ upsert + (LastPerformed UpgradeCheck time) + [LastPerformedTimestamp =. time] diff --git a/src/Stack/Types/Build.hs b/src/Stack/Types/Build.hs index 7a384decce..9b8d0674c1 100644 --- a/src/Stack/Types/Build.hs +++ b/src/Stack/Types/Build.hs @@ -124,6 +124,7 @@ data StackBuildException | TestSuiteExeMissing Bool String String String | CabalCopyFailed Bool String | LocalPackagesPresent [PackageIdentifier] + | CouldNotLockDistDir !(Path Abs File) deriving Typeable data FlagSource = FSCommandLine | FSStackYaml @@ -310,6 +311,11 @@ instance Show StackBuildException where show (LocalPackagesPresent locals) = unlines $ "Local packages are not allowed when using the script command. Packages found:" : map (\ident -> "- " ++ packageIdentifierString ident) locals + show (CouldNotLockDistDir lockFile) = unlines + [ "Locking the dist directory failed, try to lock file:" + , " " ++ toFilePath lockFile + , "Maybe you're running another copy of Stack?" + ] missingExeError :: Bool -> String -> String missingExeError isSimpleBuildType msg = @@ -625,6 +631,7 @@ configureOptsNoDir econfig bco deps isLocal package = concat else "-") <> flagNameString name) (Map.toList flags) + , map T.unpack $ packageCabalConfigOpts package , concatMap (\x -> [compilerOptionsCabalFlag wc, T.unpack x]) (packageGhcOptions package) , map ("--extra-include-dirs=" ++) (configExtraIncludeDirs config) , map ("--extra-lib-dirs=" ++) (configExtraLibDirs config) diff --git a/src/Stack/Types/Cache.hs b/src/Stack/Types/Cache.hs index 752d4c84ab..70087213da 100644 --- a/src/Stack/Types/Cache.hs +++ b/src/Stack/Types/Cache.hs @@ -3,6 +3,7 @@ module Stack.Types.Cache ( ConfigCacheType(..) + , Action(..) 
) where import qualified Data.Text as T @@ -43,3 +44,13 @@ instance PersistField ConfigCacheType where instance PersistFieldSql ConfigCacheType where sqlType _ = SqlString + +data Action + = UpgradeCheck + deriving (Show, Eq, Ord) +instance PersistField Action where + toPersistValue UpgradeCheck = PersistInt64 1 + fromPersistValue (PersistInt64 1) = Right UpgradeCheck + fromPersistValue x = Left $ T.pack $ "Invalid Action: " ++ show x +instance PersistFieldSql Action where + sqlType _ = SqlInt64 diff --git a/src/Stack/Types/Config.hs b/src/Stack/Types/Config.hs index 68df7e9bdc..515a047067 100644 --- a/src/Stack/Types/Config.hs +++ b/src/Stack/Types/Config.hs @@ -53,7 +53,6 @@ module Stack.Types.Config ,parseGHCVariant ,HasGHCVariant(..) ,snapshotsDir - ,globalHintsFile -- ** EnvConfig & HasEnvConfig ,EnvConfig(..) ,HasSourceMap(..) @@ -62,6 +61,8 @@ module Stack.Types.Config -- * Details -- ** ApplyGhcOptions ,ApplyGhcOptions(..) + -- ** CabalConfigKey + ,CabalConfigKey(..) -- ** ConfigException ,HpackExecutable(..) ,ConfigException(..) @@ -321,6 +322,8 @@ data Config = -- ^ Additional GHC options to apply to specific packages. ,configGhcOptionsByCat :: !(Map ApplyGhcOptions [Text]) -- ^ Additional GHC options to apply to categories of packages + ,configCabalConfigOpts :: !(Map CabalConfigKey [Text]) + -- ^ Additional options to be passed to ./Setup.hs configure ,configSetupInfoLocations :: ![SetupInfoLocation] -- ^ Additional SetupInfo (inline or remote) to use to find tools. ,configPvpBounds :: !PvpBounds @@ -362,6 +365,8 @@ data Config = -- ^ Database connection pool for Stack database ,configHideSourcePaths :: !Bool -- ^ Enable GHC hiding source paths? + ,configRecommendUpgrade :: !Bool + -- ^ Recommend a Stack upgrade? } -- | The project root directory, if in a project. @@ -372,6 +377,27 @@ configProjectRoot c = PCGlobalProject -> Nothing PCNoProject _deps -> Nothing +-- | Which packages do configure opts apply to? +data CabalConfigKey + = CCKTargets -- ^ See AGOTargets + | CCKLocals -- ^ See AGOLocals + | CCKEverything -- ^ See AGOEverything + | CCKPackage !PackageName -- ^ A specific package + deriving (Show, Read, Eq, Ord) +instance FromJSON CabalConfigKey where + parseJSON = withText "CabalConfigKey" parseCabalConfigKey +instance FromJSONKey CabalConfigKey where + fromJSONKey = FromJSONKeyTextParser parseCabalConfigKey + +parseCabalConfigKey :: Monad m => Text -> m CabalConfigKey +parseCabalConfigKey "$targets" = pure CCKTargets +parseCabalConfigKey "$locals" = pure CCKLocals +parseCabalConfigKey "$everything" = pure CCKEverything +parseCabalConfigKey name = + case parsePackageName $ T.unpack name of + Nothing -> fail $ "Invalid CabalConfigKey: " ++ show name + Just x -> pure $ CCKPackage x + -- | Which packages do ghc-options on the command line apply to? data ApplyGhcOptions = AGOTargets -- ^ all local targets | AGOLocals -- ^ all local packages, even non-targets @@ -733,6 +759,8 @@ data ConfigMonoid = -- ^ See 'configGhcOptionsAll'. Uses 'Monoid.Dual' so that options -- from the configs on the right come first, so that they can be -- overridden. + ,configMonoidCabalConfigOpts :: !(MonoidMap CabalConfigKey (Monoid.Dual [Text])) + -- ^ See 'configCabalConfigOpts'. 
,configMonoidExtraPath :: ![Path Abs Dir] -- ^ Additional paths to search for executables in ,configMonoidSetupInfoLocations :: ![SetupInfoLocation] @@ -768,6 +796,8 @@ data ConfigMonoid = , configMonoidStyles :: !StylesUpdate , configMonoidHideSourcePaths :: !FirstTrue -- ^ See 'configHideSourcePaths' + , configMonoidRecommendUpgrade :: !FirstTrue + -- ^ See 'configRecommendUpgrade' } deriving (Show, Generic) @@ -855,6 +885,9 @@ parseConfigMonoidObject rootDir obj = do configMonoidGhcOptionsByName = coerce $ Map.fromList [(name, opts) | (GOKPackage name, opts) <- Map.toList options] + configMonoidCabalConfigOpts' <- obj ..:? "configure-options" ..!= mempty + let configMonoidCabalConfigOpts = coerce (configMonoidCabalConfigOpts' :: Map CabalConfigKey [Text]) + configMonoidExtraPath <- obj ..:? configMonoidExtraPathName ..!= [] configMonoidSetupInfoLocations <- maybeToList <$> jsonSubWarningsT (obj ..:? configMonoidSetupInfoLocationsName) @@ -884,6 +917,7 @@ parseConfigMonoidObject rootDir obj = do <|> configMonoidStylesGB configMonoidHideSourcePaths <- FirstTrue <$> obj ..:? configMonoidHideSourcePathsName + configMonoidRecommendUpgrade <- FirstTrue <$> obj ..:? configMonoidRecommendUpgradeName return ConfigMonoid {..} where @@ -1038,6 +1072,9 @@ configMonoidStylesGBName = "stack-colours" configMonoidHideSourcePathsName :: Text configMonoidHideSourcePathsName = "hide-source-paths" +configMonoidRecommendUpgradeName :: Text +configMonoidRecommendUpgradeName = "recommend-stack-upgrade" + data ConfigException = ParseConfigFileException (Path Abs File) ParseException | ParseCustomSnapshotException Text ParseException @@ -1209,12 +1246,6 @@ snapshotsDir = do platform <- platformGhcRelDir return $ root relDirSnapshots platform --- | Cached global hints file -globalHintsFile :: (MonadReader env m, HasConfig env) => m (Path Abs File) -globalHintsFile = do - root <- view stackRootL - pure $ root relDirGlobalHints relFileGlobalHintsYaml - -- | Installation root for dependencies installationRootDeps :: (HasEnvConfig env) => RIO env (Path Abs Dir) installationRootDeps = do diff --git a/src/Stack/Types/Package.hs b/src/Stack/Types/Package.hs index ddea4e3080..69e23b33b3 100644 --- a/src/Stack/Types/Package.hs +++ b/src/Stack/Types/Package.hs @@ -102,6 +102,7 @@ data Package = ,packageUnknownTools :: !(Set ExeName) -- ^ Build tools specified in the legacy manner (build-tools:) that failed the hard-coded lookup. ,packageAllDeps :: !(Set PackageName) -- ^ Original dependencies (not sieved). ,packageGhcOptions :: ![Text] -- ^ Ghc options used on package. + ,packageCabalConfigOpts :: ![Text] -- ^ Additional options passed to ./Setup.hs configure ,packageFlags :: !(Map FlagName Bool) -- ^ Flags used on package. ,packageDefaultFlags :: !(Map FlagName Bool) -- ^ Defaults for unspecified flags. ,packageLibraries :: !PackageLibraries -- ^ does the package have a buildable library stanza? @@ -216,6 +217,7 @@ data PackageConfig = ,packageConfigEnableBenchmarks :: !Bool -- ^ Are benchmarks enabled? ,packageConfigFlags :: !(Map FlagName Bool) -- ^ Configured flags. ,packageConfigGhcOptions :: ![Text] -- ^ Configured ghc options. 
+ ,packageConfigCabalConfigOpts :: ![Text] -- ^ ./Setup.hs configure options ,packageConfigCompilerVersion :: ActualCompiler -- ^ GHC version ,packageConfigPlatform :: !Platform -- ^ host platform } diff --git a/src/Stack/Types/SourceMap.hs b/src/Stack/Types/SourceMap.hs index 56b712ff61..bda2a8eeb8 100644 --- a/src/Stack/Types/SourceMap.hs +++ b/src/Stack/Types/SourceMap.hs @@ -38,6 +38,7 @@ data CommonPackage = CommonPackage , cpFlags :: !(Map FlagName Bool) -- ^ overrides default flags , cpGhcOptions :: ![Text] -- also lets us know if we're doing profiling + , cpCabalConfigOpts :: ![Text] , cpHaddocks :: !Bool } diff --git a/src/Stack/Unpack.hs b/src/Stack/Unpack.hs index 0780ec1d9b..5c5b5f9ad6 100644 --- a/src/Stack/Unpack.hs +++ b/src/Stack/Unpack.hs @@ -69,14 +69,14 @@ unpackPackages mSnapshot dest input = do toLocNoSnapshot :: PackageName -> RIO env (Either String (PackageLocationImmutable, PackageIdentifier)) toLocNoSnapshot name = do - mloc1 <- getLatestHackageLocation name UsePreferredVersions + mloc1 <- getLatestHackageLocation YesRequireHackageIndex name UsePreferredVersions mloc <- case mloc1 of Just _ -> pure mloc1 Nothing -> do updated <- updateHackageIndex $ Just $ "Could not find package " <> fromString (packageNameString name) <> ", updating" case updated of - UpdateOccurred -> getLatestHackageLocation name UsePreferredVersions + UpdateOccurred -> getLatestHackageLocation YesRequireHackageIndex name UsePreferredVersions NoUpdateOccurred -> pure Nothing case mloc of Nothing -> do diff --git a/src/Stack/Upgrade.hs b/src/Stack/Upgrade.hs index 8ff7d58a5d..42df0facf2 100644 --- a/src/Stack/Upgrade.hs +++ b/src/Stack/Upgrade.hs @@ -210,7 +210,7 @@ sourceUpgrade builtHash (SourceOpts gitRepo) = Nothing -> withConfig NoReexec $ do void $ updateHackageIndex $ Just "Updating index to make sure we find the latest Stack version" - mversion <- getLatestHackageVersion "stack" UsePreferredVersions + mversion <- getLatestHackageVersion YesRequireHackageIndex "stack" UsePreferredVersions (PackageIdentifierRevision _ version _) <- case mversion of Nothing -> throwString "No stack found in package indices" @@ -223,7 +223,7 @@ sourceUpgrade builtHash (SourceOpts gitRepo) = else do suffix <- parseRelDir $ "stack-" ++ versionString version let dir = tmp suffix - mrev <- getLatestHackageRevision "stack" version + mrev <- getLatestHackageRevision YesRequireHackageIndex "stack" version case mrev of Nothing -> throwString "Latest version with no revision" Just (_rev, cfKey, treeKey) -> do diff --git a/src/main/BuildInfo.hs b/src/main/BuildInfo.hs new file mode 100644 index 0000000000..8cb480d469 --- /dev/null +++ b/src/main/BuildInfo.hs @@ -0,0 +1,83 @@ +{-# LANGUAGE CPP #-} +{-# LANGUAGE NoImplicitPrelude #-} +{-# LANGUAGE OverloadedStrings #-} + +#ifdef USE_GIT_INFO +{-# LANGUAGE TemplateHaskell #-} +#endif + +-- Extracted from Main so that the Main module does not use CPP or TH, +-- and therefore doesn't need to be recompiled as often. 
+module BuildInfo + ( versionString' + , maybeGitHash + , hpackVersion + ) where + +import Stack.Prelude +import qualified Paths_stack as Meta +import qualified Distribution.Text as Cabal (display) +import Distribution.System (buildArch) + +#ifndef HIDE_DEP_VERSIONS +import qualified Build_stack +#endif + +#ifdef USE_GIT_INFO +import GitHash (giCommitCount, giHash, tGitInfoCwdTry) +#endif + +#ifdef USE_GIT_INFO +import Options.Applicative.Simple (simpleVersion) +#endif + +versionString' :: String +#ifdef USE_GIT_INFO +versionString' = concat $ concat + [ [$(simpleVersion Meta.version)] + -- Leave out number of commits for --depth=1 clone + -- See https://github.com/commercialhaskell/stack/issues/792 + , case giCommitCount <$> $$tGitInfoCwdTry of + Left _ -> [] + Right 1 -> [] + Right count -> [" (", show count, " commits)"] + , [" ", Cabal.display buildArch] + , [depsString, warningString] + ] +#else +versionString' = + showVersion Meta.version + ++ ' ' : Cabal.display buildArch + ++ depsString + ++ warningString +#endif + where +#ifdef HIDE_DEP_VERSIONS + depsString = " hpack-" ++ VERSION_hpack +#else + depsString = "\nCompiled with:\n" ++ unlines (map ("- " ++) Build_stack.deps) +#endif +#ifdef SUPPORTED_BUILD + warningString = "" +#else + warningString = unlines + [ "" + , "Warning: this is an unsupported build that may use different versions of" + , "dependencies and GHC than the officially released binaries, and therefore may" + , "not behave identically. If you encounter problems, please try the latest" + , "official build by running 'stack upgrade --force-download'." + ] +#endif + +-- | If USE_GIT_INFO is enabled, the Git hash in the build directory, otherwise Nothing. +maybeGitHash :: Maybe String +maybeGitHash = +#ifdef USE_GIT_INFO + (either (const Nothing) (Just . giHash) $$tGitInfoCwdTry) +#else + Nothing +#endif + +-- | Hpack version we're compiled against +hpackVersion :: String +hpackVersion = VERSION_hpack diff --git a/src/main/Main.hs b/src/main/Main.hs index e45ea7d7bf..c26d881511 100644 --- a/src/main/Main.hs +++ b/src/main/Main.hs @@ -1,5 +1,4 @@ {-# LANGUAGE NoImplicitPrelude #-} -{-# LANGUAGE CPP #-} {-# LANGUAGE DeriveDataTypeable #-} {-# LANGUAGE FlexibleContexts #-} {-# LANGUAGE GADTs #-} @@ -7,17 +6,11 @@ {-# LANGUAGE ScopedTypeVariables #-} {-# LANGUAGE RecordWildCards #-} -#ifdef USE_GIT_INFO -{-# LANGUAGE TemplateHaskell #-} -#endif - -- | Main stack tool entry point. 
module Main (main) where -#ifndef HIDE_DEP_VERSIONS -import qualified Build_stack -#endif +import BuildInfo import Stack.Prelude hiding (Display (..)) import Control.Monad.Reader (local) import Control.Monad.Trans.Except (ExceptT) @@ -31,20 +24,12 @@ import qualified Data.Set as Set import qualified Data.Text as T import Data.Version (showVersion) import RIO.Process -#ifdef USE_GIT_INFO -import GitHash (giCommitCount, giHash, tGitInfoCwdTry) -#endif -import Distribution.System (buildArch) -import qualified Distribution.Text as Cabal (display) import Distribution.Version (mkVersion') import GHC.IO.Encoding (mkTextEncoding, textEncodingName) import Options.Applicative import Options.Applicative.Help (errorHelp, stringChunk, vcatChunks) import Options.Applicative.Builder.Extra import Options.Applicative.Complicated -#ifdef USE_GIT_INFO -import Options.Applicative.Simple (simpleVersion) -#endif import Options.Applicative.Types (ParserHelp(..)) import Pantry (loadSnapshot) import Path @@ -120,44 +105,6 @@ hSetTranslit h = do hSetEncoding h enc' _ -> return () -versionString' :: String -#ifdef USE_GIT_INFO -versionString' = concat $ concat - [ [$(simpleVersion Meta.version)] - -- Leave out number of commits for --depth=1 clone - -- See https://github.com/commercialhaskell/stack/issues/792 - , case giCommitCount <$> $$tGitInfoCwdTry of - Left _ -> [] - Right 1 -> [] - Right count -> [" (", show count, " commits)"] - , [" ", Cabal.display buildArch] - , [depsString, warningString] - ] -#else -versionString' = - showVersion Meta.version - ++ ' ' : Cabal.display buildArch - ++ depsString - ++ warningString -#endif - where -#ifdef HIDE_DEP_VERSIONS - depsString = " hpack-" ++ VERSION_hpack -#else - depsString = "\nCompiled with:\n" ++ unlines (map ("- " ++) Build_stack.deps) -#endif -#ifdef SUPPORTED_BUILD - warningString = "" -#else - warningString = unlines - [ "" - , "Warning: this is an unsupported build that may use different versions of" - , "dependencies and GHC than the officially released binaries, and therefore may" - , "not behave identically. If you encounter problems, please try the latest" - , "official build by running 'stack upgrade --force-download'." - ] -#endif - main :: IO () main = do -- Line buffer the output by default, particularly for non-terminal runs. @@ -218,7 +165,7 @@ commandLineHandler commandLineHandler currentDir progName isInterpreter = complicatedOptions (mkVersion' Meta.version) (Just versionString') - VERSION_hpack + hpackVersion "stack - The Haskell Tool Stack" "" "stack's documentation is available at https://docs.haskellstack.org/" @@ -675,11 +622,7 @@ upgradeCmd upgradeOpts' = do Nothing -> withGlobalProject $ upgrade -#ifdef USE_GIT_INFO - (either (const Nothing) (Just . 
giHash) $$tGitInfoCwdTry) -#else - Nothing -#endif + maybeGitHash upgradeOpts' -- | Upload to Hackage diff --git a/src/test/Stack/LockSpec.hs b/src/test/Stack/LockSpec.hs new file mode 100644 index 0000000000..3d15829987 --- /dev/null +++ b/src/test/Stack/LockSpec.hs @@ -0,0 +1,172 @@ +{-# LANGUAGE QuasiQuotes #-} +{-# LANGUAGE ScopedTypeVariables #-} +{-# LANGUAGE OverloadedStrings #-} + +module Stack.LockSpec where + +import Data.Aeson.Extended (WithJSONWarnings(..)) +import Data.ByteString (ByteString) +import qualified Data.Yaml as Yaml +import Distribution.Types.PackageName (mkPackageName) +import Distribution.Types.Version (mkVersion) +import Pantry +import qualified Pantry.SHA256 as SHA256 +import RIO +import Stack.Lock +import Test.Hspec +import Text.RawString.QQ + +toBlobKey :: ByteString -> Word -> BlobKey +toBlobKey string size = BlobKey (decodeSHA string) (FileSize size) + +decodeSHA :: ByteString -> SHA256 +decodeSHA string = + case SHA256.fromHexBytes string of + Right csha -> csha + Left err -> error $ "Failed decoding. Error: " <> show err + +decodeLocked :: ByteString -> IO Locked +decodeLocked bs = do + val <- Yaml.decodeThrow bs + case Yaml.parseEither Yaml.parseJSON val of + Left err -> throwIO $ Yaml.AesonException err + Right (WithJSONWarnings res warnings) -> do + unless (null warnings) $ + throwIO $ Yaml.AesonException $ "Unexpected warnings: " ++ show warnings + -- we just assume no file references + resolvePaths Nothing res + +spec :: Spec +spec = do + it "parses lock file (empty with GHC resolver)" $ do + let lockFile :: ByteString + lockFile = + [r|#some +snapshots: +- completed: + compiler: ghc-8.2.2 + original: + compiler: ghc-8.2.2 +packages: [] +|] + pkgImm <- lckPkgImmutableLocations <$> decodeLocked lockFile + pkgImm `shouldBe` [] + it "parses lock file (empty with LTS resolver)" $ do + let lockFile :: ByteString + lockFile = + [r|#some +snapshots: +- completed: + size: 527801 + url: https://raw.githubusercontent.com/commercialhaskell/stackage-snapshots/master/lts/11/22.yaml + sha256: 7c8b1853da784bd7beb8728168bf4e879d8a2f6daf408ca0fa7933451864a96a + original: lts-11.22 +- completed: + compiler: ghc-8.2.2 + original: + compiler: ghc-8.2.2 +packages: [] +|] + pkgImm <- lckPkgImmutableLocations <$> decodeLocked lockFile + pkgImm `shouldBe` [] + it "parses lock file (LTS, wai + warp)" $ do + let lockFile :: ByteString + lockFile = + [r|#some +snapshots: +- completed: + size: 527801 + url: https://raw.githubusercontent.com/commercialhaskell/stackage-snapshots/master/lts/11/22.yaml + sha256: 7c8b1853da784bd7beb8728168bf4e879d8a2f6daf408ca0fa7933451864a96a + original: lts-11.22 +- completed: + compiler: ghc-8.2.2 + original: + compiler: ghc-8.2.2 +packages: +- original: + subdir: wai + git: https://github.com/yesodweb/wai.git + commit: d11d63f1a6a92db8c637a8d33e7953ce6194a3e0 + completed: + subdir: wai + cabal-file: + size: 1765 + sha256: eea52c4967d8609c2f79213d6dffe6d6601034f1471776208404781de7051410 + name: wai + version: 3.2.1.2 + git: https://github.com/yesodweb/wai.git + pantry-tree: + size: 714 + sha256: ecfd0b4b75f435a3f362394807b35e5ef0647b1a25005d44a3632c49db4833d2 + commit: d11d63f1a6a92db8c637a8d33e7953ce6194a3e0 +- original: + subdir: warp + git: https://github.com/yesodweb/wai.git + commit: d11d63f1a6a92db8c637a8d33e7953ce6194a3e0 + completed: + subdir: warp + cabal-file: + size: 10725 + sha256: cfec5336260bb4b1ecbd833f7d6948fd1ee373770471fe79796cf9c389c71758 + name: warp + version: 3.2.25 + git: https://github.com/yesodweb/wai.git + pantry-tree: + 
size: 5103 + sha256: f808e075811b002563d24c393ce115be826bb66a317d38da22c513ee42b7443a + commit: d11d63f1a6a92db8c637a8d33e7953ce6194a3e0 +|] + pkgImm <- lckPkgImmutableLocations <$> decodeLocked lockFile + let waiSubdirRepo subdir = + Repo { repoType = RepoGit + , repoUrl = "https://github.com/yesodweb/wai.git" + , repoCommit = + "d11d63f1a6a92db8c637a8d33e7953ce6194a3e0" + , repoSubdir = subdir + } + emptyRPM = RawPackageMetadata { rpmName = Nothing + , rpmVersion = Nothing + , rpmTreeKey = Nothing + , rpmCabal = Nothing + } + pkgImm `shouldBe` + [ LockedLocation + (RPLIRepo (waiSubdirRepo "wai") emptyRPM) + (PLIRepo (waiSubdirRepo "wai") + (PackageMetadata { pmIdent = + PackageIdentifier + { pkgName = mkPackageName "wai" + , pkgVersion = mkVersion [3, 2, 1, 2] + } + , pmTreeKey = + TreeKey + (BlobKey + (decodeSHA + "ecfd0b4b75f435a3f362394807b35e5ef0647b1a25005d44a3632c49db4833d2") + (FileSize 714)) + , pmCabal = + toBlobKey + "eea52c4967d8609c2f79213d6dffe6d6601034f1471776208404781de7051410" + 1765 + })) + , LockedLocation + (RPLIRepo (waiSubdirRepo "warp") emptyRPM) + (PLIRepo (waiSubdirRepo "warp") + (PackageMetadata { pmIdent = + PackageIdentifier + { pkgName = mkPackageName "warp" + , pkgVersion = mkVersion [3, 2, 25] + } + , pmTreeKey = + TreeKey + (BlobKey + (decodeSHA + "f808e075811b002563d24c393ce115be826bb66a317d38da22c513ee42b7443a") + (FileSize 5103)) + , pmCabal = + toBlobKey + "cfec5336260bb4b1ecbd833f7d6948fd1ee373770471fe79796cf9c389c71758" + 10725 + })) + ] diff --git a/src/test/Stack/ModuleInterfaceSpec.hs b/src/test/Stack/ModuleInterfaceSpec.hs new file mode 100644 index 0000000000..96c8c309fb --- /dev/null +++ b/src/test/Stack/ModuleInterfaceSpec.hs @@ -0,0 +1,58 @@ +{-# LANGUAGE NoImplicitPrelude #-} +{-# LANGUAGE OverloadedStrings #-} + +module Stack.ModuleInterfaceSpec where + +import Data.Foldable (traverse_) +import Data.Semigroup ((<>)) +import qualified Stack.ModuleInterface as Iface +import Stack.Prelude hiding (Version) +import System.Directory (doesFileExist) +import Test.Hspec (Spec, describe, it, shouldBe) + +type Version = String +type Architecture = String +type Directory = FilePath +type Usage = String +type Module = ByteString + +versions32 :: [Version] +versions32 = ["ghc7103", "ghc802", "ghc822", "ghc844"] + +versions64 :: [Version] +versions64 = ["ghc822", "ghc844", "ghc864"] + +spec :: Spec +spec = describe "should successfully deserialize x32 interface for" $ do + traverse_ (deserialize check32) (("x32/" <>) <$> versions32) + traverse_ (deserialize check64) (("x64/" <>) <$> versions64) + +check32 :: Iface.Interface -> IO () +check32 iface = do + hasExpectedUsage "some-dependency.txt" iface `shouldBe` True + +check64 :: Iface.Interface -> IO () +check64 iface = do + hasExpectedUsage "Test.h" iface `shouldBe` True + hasExpectedUsage "README.md" iface `shouldBe` True + hasExpectedModule "X" iface `shouldBe` True + +deserialize :: (Iface.Interface -> IO ()) -> Directory -> Spec +deserialize check d = do + it d $ do + let ifacePath = "test/files/iface/" <> d <> "/Main.hi" + exists <- doesFileExist ifacePath + when exists $ do + result <- Iface.fromFile ifacePath + case result of + (Left msg) -> fail msg + (Right iface) -> check iface + +-- | `Usage` is the name given by GHC to a TH dependency +hasExpectedUsage :: Usage -> Iface.Interface -> Bool +hasExpectedUsage u = + elem u . fmap Iface.unUsage . Iface.unList . Iface.usage + +hasExpectedModule :: Module -> Iface.Interface -> Bool +hasExpectedModule m = + elem m . fmap fst . Iface.unList . Iface.dmods .
Iface.deps diff --git a/subs/curator/app/Main.hs b/subs/curator/app/Main.hs index b59d1fa2d1..9db9561e1f 100644 --- a/subs/curator/app/Main.hs +++ b/subs/curator/app/Main.hs @@ -11,13 +11,11 @@ import Network.HTTP.Download (download) import Options.Applicative.Simple hiding (action) import qualified Pantry import Path (toFilePath) -import Path.IO (doesFileExist, resolveFile', resolveDir') +import Path.IO (doesFileExist, removeFile, resolveFile', resolveDir') import Paths_curator (version) import qualified RIO.ByteString.Lazy as BL import RIO.List (stripPrefix) import qualified RIO.Map as Map -import RIO.PrettyPrint -import RIO.PrettyPrint.StylesUpdate import RIO.Process import qualified RIO.Text as T import RIO.Time @@ -106,18 +104,22 @@ update = do constraints :: Target -> RIO PantryApp () constraints target = - withFixedColorTerm $ case target of + case target of TargetLts x y | y > 0 -> do let prev = y - 1 - url = concat [ "https://raw.githubusercontent.com/commercialhaskell/stackage-constraints/master/lts-" + url = concat [ "https://raw.githubusercontent.com/" ++ constraintsRepo ++ "/master/lts/" , show x - , "." + , "/" , show prev , ".yaml" ] - logInfo $ "Reusing constraints.yaml from lts-" <> display x <> "." <> display prev + logInfo $ "Will reuse constraints.yaml from lts-" <> display x <> "." <> display prev req <- parseUrlThrow url constraintsPath <- resolveFile' constraintsFilename + exists <- doesFileExist constraintsPath + when exists $ do + logWarn "Local constraints file will be deleted before downloading reused constraints" + removeFile constraintsPath downloaded <- download req constraintsPath unless downloaded $ error $ "Could not download constraints.yaml from " <> url @@ -151,50 +153,20 @@ snapshot = do complete <- completeSnapshotLayer incomplete liftIO $ encodeFile snapshotFilename complete -loadSnapshotYaml :: RIO PantryApp Pantry.RawSnapshot +loadSnapshotYaml :: RIO PantryApp Pantry.Snapshot loadSnapshotYaml = do abs' <- resolveFile' snapshotFilename - loadSnapshot $ SLFilePath $ - ResolvedPath (RelFilePath (fromString snapshotFilename)) abs' + let sloc = SLFilePath $ + ResolvedPath (RelFilePath (fromString snapshotFilename)) abs' + (snap, _, _) <- loadAndCompleteSnapshot sloc Map.empty Map.empty + pure snap checkSnapshot :: RIO PantryApp () checkSnapshot = do logInfo "Checking dependencies in snapshot.yaml" decodeFileThrow constraintsFilename >>= \constraints' -> do snapshot' <- loadSnapshotYaml - withFixedColorTerm $ checkDependencyGraph constraints' snapshot' - -data FixedColorTermApp = FixedColorTermApp - { fctApp :: PantryApp - , fctWidth :: Int - } - -pantryAppL :: Lens' FixedColorTermApp PantryApp -pantryAppL = lens fctApp (\s a -> s{ fctApp = a}) - -instance HasLogFunc FixedColorTermApp where - logFuncL = pantryAppL.logFuncL - -instance HasStylesUpdate FixedColorTermApp where - stylesUpdateL = lens (const $ StylesUpdate []) (\s _ -> s) - -instance HasTerm FixedColorTermApp where - useColorL = lens (const True) (\s _ -> s) - termWidthL = lens fctWidth (\s w -> s{ fctWidth = w }) - -instance HasPantryConfig FixedColorTermApp where - pantryConfigL = pantryAppL.pantryConfigL - -instance HasProcessContext FixedColorTermApp where - processContextL = pantryAppL.processContextL - -withFixedColorTerm :: RIO FixedColorTermApp a -> RIO PantryApp a -withFixedColorTerm action = do - app <- ask - runRIO (FixedColorTermApp app defaultTerminalWidth) action - -defaultTerminalWidth :: Int -defaultTerminalWidth = 100 + checkDependencyGraph constraints' snapshot' unpackDir :: 
FilePath unpackDir = "unpack-dir" @@ -202,9 +174,7 @@ unpackDir = "unpack-dir" unpackFiles :: RIO PantryApp () unpackFiles = do logInfo "Unpacking files" - abs' <- resolveFile' snapshotFilename - snapshot' <- loadSnapshot $ SLFilePath $ - ResolvedPath (RelFilePath (fromString snapshotFilename)) abs' + snapshot' <- loadSnapshotYaml constraints' <- decodeFileThrow constraintsFilename dest <- resolveDir' unpackDir unpackSnapshot constraints' snapshot' dest @@ -222,7 +192,7 @@ hackageDistro target = do logInfo "Uploading Hackage distro for snapshot.yaml" snapshot' <- loadSnapshotYaml let packageVersions = - Map.mapMaybe (snapshotVersion . rspLocation) (rsPackages snapshot') + Map.mapMaybe (snapshotVersion . spLocation) (snapshotPackages snapshot') uploadHackageDistro target packageVersions uploadDocs' :: Target -> RIO PantryApp () @@ -251,4 +221,4 @@ loadPantrySnapshotLayerFile fp = do eres <- loadSnapshotLayer $ SLFilePath (ResolvedPath (RelFilePath (fromString fp)) abs') case eres of Left x -> error $ "should not happen: " ++ show (fp, x) - Right (x, _) -> pure x + Right x -> pure x diff --git a/subs/curator/src/Curator/Constants.hs b/subs/curator/src/Curator/Constants.hs index ee42a911b1..f14ee0ebb6 100644 --- a/subs/curator/src/Curator/Constants.hs +++ b/subs/curator/src/Curator/Constants.hs @@ -1,6 +1,8 @@ module Curator.Constants ( snapshotFilename , constraintsFilename + , snapshotsRepo + , constraintsRepo ) where snapshotFilename :: FilePath @@ -8,3 +10,9 @@ snapshotFilename = "snapshot.yaml" constraintsFilename :: FilePath constraintsFilename = "constraints.yaml" + +snapshotsRepo :: String +snapshotsRepo = "commercialhaskell/stackage-next" + +constraintsRepo :: String +constraintsRepo = "commercialhaskell/stackage-constraints-next" diff --git a/subs/curator/src/Curator/Repo.hs b/subs/curator/src/Curator/Repo.hs index 373c17e840..33f0a7ecf5 100644 --- a/subs/curator/src/Curator/Repo.hs +++ b/subs/curator/src/Curator/Repo.hs @@ -58,7 +58,7 @@ checkoutSnapshotsRepo :: -> m ([String] -> m (), Path Abs File, String) checkoutSnapshotsRepo t = checkoutRepo t dir url where - url = "git@github.com:commercialhaskell/stackage-next" + url = "git@github.com:" ++ snapshotsRepo dir = $(mkRelDir "stackage-snapshots") checkoutConstraintsRepo :: @@ -72,7 +72,7 @@ checkoutConstraintsRepo :: -> m ([String] -> m (), Path Abs File, String) checkoutConstraintsRepo t = checkoutRepo t dir url where - url = "git@github.com:commercialhaskell/stackage-constraints-next" + url = "git@github.com:" ++ constraintsRepo dir = $(mkRelDir "stackage-constraints") checkoutRepo :: diff --git a/subs/curator/src/Curator/Snapshot.hs b/subs/curator/src/Curator/Snapshot.hs index b83d536a64..033098ce4e 100644 --- a/subs/curator/src/Curator/Snapshot.hs +++ b/subs/curator/src/Curator/Snapshot.hs @@ -38,6 +38,7 @@ import RIO.Seq (Seq) import qualified RIO.Seq as Seq import qualified RIO.Text as T import qualified RIO.Text.Partial as TP +import RIO.Time (getCurrentTime) makeSnapshot :: (HasPantryConfig env, HasLogFunc env, HasProcessContext env) @@ -49,6 +50,7 @@ makeSnapshot cons = do Map.toList $ consPackages cons let snapshotPackages = Set.fromList [ pn | (pn, Just _) <- locs ] inSnapshot pn = pn `Set.member` snapshotPackages + now <- getCurrentTime pure RawSnapshotLayer { rslParent = RSLCompiler $ WCGhc $ consGhcVersion cons @@ -60,6 +62,7 @@ makeSnapshot cons = do , rslHidden = Map.filterWithKey (\pn hide -> hide && inSnapshot pn) (pcHide <$> consPackages cons) , rslGhcOptions = mempty + , rslPublishTime = Just now } getFlags :: 
PackageConstraints -> Maybe (Map FlagName Bool) @@ -75,7 +78,7 @@ toLoc toLoc name pc = case pcSource pc of PSHackage (HackageSource mrange mrequiredLatest revisions) -> do - versions <- getHackagePackageVersions IgnorePreferredVersions name -- don't follow the preferred versions on Hackage, give curators more control + versions <- getHackagePackageVersions YesRequireHackageIndex IgnorePreferredVersions name -- don't follow the preferred versions on Hackage, give curators more control when (Map.null versions) $ error $ "Package not found on Hackage: " ++ packageNameString name for_ mrequiredLatest $ \required -> case Map.maxViewWithKey versions of @@ -131,18 +134,18 @@ instance Exception TraverseValidateExceptions checkDependencyGraph :: (HasTerm env, HasProcessContext env, HasPantryConfig env) => Constraints - -> RawSnapshot + -> Snapshot -> RIO env () checkDependencyGraph constraints snapshot = do - let compiler = rsCompiler snapshot + let compiler = snapshotCompiler snapshot compilerVer = case compiler of WCGhc v -> v WCGhcGit {} -> error "GHC-GIT is not supported" WCGhcjs _ _ -> error "GHCJS is not supported" let snapshotPackages = Map.fromList - [ (pn, snapshotVersion (rspLocation sp)) - | (pn, sp) <- Map.toList (rsPackages snapshot) + [ (pn, snapshotVersion (spLocation sp)) + | (pn, sp) <- Map.toList (Pantry.snapshotPackages snapshot) ] ghcBootPackages0 <- liftIO $ getBootPackages compilerVer let ghcBootPackages = prunedBootPackages ghcBootPackages0 (Map.keysSet snapshotPackages) @@ -157,7 +160,7 @@ checkDependencyGraph constraints snapshot = do Just (Just cabalVersion) -> do let isWiredIn pn _ = pn `Set.member` wiredInGhcPackages (wiredIn, packages) = - Map.partitionWithKey isWiredIn (rsPackages snapshot) + Map.partitionWithKey isWiredIn (Pantry.snapshotPackages snapshot) if not (Map.null wiredIn) then do let errMsg = "GHC wired-in package can not be overriden" @@ -259,8 +262,8 @@ pkgBoundsError dep maintainers mdepVer isBoot users = display :: DT.Text a => a -> Text display = T.pack . 
DT.display -snapshotVersion :: RawPackageLocationImmutable -> Maybe Version -snapshotVersion (RPLIHackage (PackageIdentifierRevision _ v _) _) = Just v +snapshotVersion :: PackageLocationImmutable -> Maybe Version +snapshotVersion (PLIHackage (PackageIdentifier _ v) _ _) = Just v snapshotVersion _ = Nothing data DependencyError = @@ -347,10 +350,10 @@ getPkgInfo :: => Constraints -> Version -> PackageName - -> RawSnapshotPackage + -> SnapshotPackage -> RIO env PkgInfo -getPkgInfo constraints compilerVer pname rsp = do - gpd <- loadCabalFileRawImmutable (rspLocation rsp) +getPkgInfo constraints compilerVer pname sp = do + gpd <- loadCabalFileImmutable (spLocation sp) logDebug $ "Extracting deps for " <> displayShow pname let mpc = Map.lookup pname (consPackages constraints) skipBuild = maybe False pcSkipBuild mpc @@ -395,7 +398,7 @@ getPkgInfo constraints compilerVer pname rsp = do , comp == CompLibrary || comp == CompExecutable , dep <- deps ] return PkgInfo - { piVersion = snapshotVersion (rspLocation rsp) + { piVersion = snapshotVersion (spLocation sp) , piAllDeps = allDeps , piTreeDeps = treeDeps , piCabalVersion = C.specVersion $ C.packageDescription gpd diff --git a/subs/curator/src/Curator/Unpack.hs b/subs/curator/src/Curator/Unpack.hs index 13c6198237..b52088e242 100644 --- a/subs/curator/src/Curator/Unpack.hs +++ b/subs/curator/src/Curator/Unpack.hs @@ -19,25 +19,29 @@ import qualified RIO.Set as Set unpackSnapshot :: (HasPantryConfig env, HasLogFunc env, HasProcessContext env) => Constraints - -> RawSnapshot + -> Snapshot -> Path Abs Dir -> RIO env () unpackSnapshot cons snap root = do unpacked <- parseRelDir "unpacked" (suffixes, flags, (skipTest, expectTestFailure), (skipBench, expectBenchmarkFailure), - (skipHaddock, expectHaddockFailure)) <- fmap fold $ for (rsPackages snap) $ \sp -> do - let pl = rspLocation sp - TreeKey (BlobKey sha _size) <- getRawPackageLocationTreeKey pl - PackageIdentifier name version <- getRawPackageLocationIdent pl + (skipHaddock, expectHaddockFailure)) <- fmap fold $ for (snapshotPackages snap) $ \sp -> do + let pl = spLocation sp + TreeKey (BlobKey sha _size) <- getPackageLocationTreeKey pl + let (PackageIdentifier name version) = + case pl of + PLIHackage ident _ _ -> ident + PLIArchive _ pm -> pmIdent pm + PLIRepo _ pm -> pmIdent pm let (flags, skipBuild, test, bench, haddock) = case Map.lookup name $ consPackages cons of Nothing -> (mempty, False, CAExpectSuccess, CAExpectSuccess, CAExpectSuccess) Just pc -> (pcFlags pc, pcSkipBuild pc, pcTests pc, pcBenchmarks pc, pcHaddock pc) - unless (flags == rspFlags sp) $ error $ unlines + unless (flags == spFlags sp) $ error $ unlines [ "mismatched flags for " ++ show pl - , " snapshot: " ++ show (rspFlags sp) + , " snapshot: " ++ show (spFlags sp) , " constraints: " ++ show flags ] if skipBuild @@ -58,7 +62,7 @@ unpackSnapshot cons snap root = do ignoringAbsence $ removeDirRecur destTmp ensureDir destTmp logInfo $ "Unpacking " <> display pl - unpackPackageLocationRaw destTmp pl + unpackPackageLocation destTmp pl renameDir destTmp dest pure ( Set.singleton suffix diff --git a/subs/pantry/.hindent.yaml b/subs/pantry/.hindent.yaml new file mode 100644 index 0000000000..5e5e32ff0f --- /dev/null +++ b/subs/pantry/.hindent.yaml @@ -0,0 +1 @@ +indent-size: 2 diff --git a/subs/pantry/attic/symlink-to-dir.tar.gz b/subs/pantry/attic/symlink-to-dir.tar.gz new file mode 100644 index 0000000000..64871ee4c5 Binary files /dev/null and b/subs/pantry/attic/symlink-to-dir.tar.gz differ diff --git 
a/subs/pantry/package.yaml b/subs/pantry/package.yaml index bbbd842967..4fb2c52a60 100644 --- a/subs/pantry/package.yaml +++ b/subs/pantry/package.yaml @@ -86,6 +86,9 @@ dependencies: - directory - filepath +ghc-options: + - -Wall + library: source-dirs: src/ when: @@ -104,6 +107,11 @@ library: # For testing - Pantry.Internal - Pantry.Internal.StaticBytes + # For stackage-server + - Pantry.Internal.Stackage + + # For stack + - Pantry.Internal.Companion # FIXME must be removed from pantry! - Data.Aeson.Extended @@ -118,3 +126,4 @@ tests: - exceptions - hedgehog - QuickCheck + - raw-strings-qq diff --git a/subs/pantry/src/Pantry.hs b/subs/pantry/src/Pantry.hs index 167ba1ced5..88b48eaaf8 100644 --- a/subs/pantry/src/Pantry.hs +++ b/subs/pantry/src/Pantry.hs @@ -1,7 +1,6 @@ {-# LANGUAGE NoImplicitPrelude #-} {-# LANGUAGE OverloadedStrings #-} {-# LANGUAGE ScopedTypeVariables #-} -{-# LANGUAGE ViewPatterns #-} -- | Content addressable Haskell package management, providing for -- secure, reproducible acquisition of Haskell package contents and -- metadata. @@ -93,6 +92,8 @@ module Pantry , loadSnapshotLayer , loadSnapshot , loadAndCompleteSnapshot + , loadAndCompleteSnapshotRaw + , CompletedPLI , addPackagesToSnapshot , AddPackagesConfig (..) @@ -105,6 +106,7 @@ module Pantry , parseWantedCompiler , parseRawSnapshotLocation , parsePackageIdentifierRevision + , parseHackageText -- ** Cabal values , parsePackageIdentifier @@ -154,6 +156,7 @@ module Pantry -- * Hackage index , updateHackageIndex , DidUpdateOccur (..) + , RequireHackageIndex (..) , hackageIndexTarballL , getHackagePackageVersions , getLatestHackageVersion @@ -168,6 +171,7 @@ module Pantry import RIO import Conduit +import Control.Arrow (right) import Control.Monad.State.Strict (State, execState, get, modify') import qualified RIO.Map as Map import qualified RIO.Set as Set @@ -178,7 +182,7 @@ import qualified RIO.FilePath as FilePath import Pantry.Archive import Pantry.Repo import qualified Pantry.SHA256 as SHA256 -import Pantry.Storage +import Pantry.Storage hiding (TreeEntry, PackageName, Version) import Pantry.Tree import Pantry.Types import Pantry.Hackage @@ -191,6 +195,7 @@ import qualified Hpack import qualified Hpack.Config as Hpack import Network.HTTP.Download import RIO.PrettyPrint +import RIO.PrettyPrint.StylesUpdate import RIO.Process import RIO.Directory (getAppUserDataDirectory) import qualified Data.Yaml as Yaml @@ -267,11 +272,12 @@ defaultHackageSecurityConfig = HackageSecurityConfig -- @since 0.1.0.0 getLatestHackageVersion :: (HasPantryConfig env, HasLogFunc env) - => PackageName -- ^ package name + => RequireHackageIndex + -> PackageName -- ^ package name -> UsePreferredVersions -> RIO env (Maybe PackageIdentifierRevision) -getLatestHackageVersion name preferred = - ((fmap fst . Map.maxViewWithKey) >=> go) <$> getHackagePackageVersions preferred name +getLatestHackageVersion req name preferred = + ((fmap fst . Map.maxViewWithKey) >=> go) <$> getHackagePackageVersions req preferred name where go (version, m) = do (_rev, BlobKey sha size) <- fst <$> Map.maxViewWithKey m @@ -283,12 +289,13 @@ getLatestHackageVersion name preferred = -- @since 0.1.0.0 getLatestHackageLocation :: (HasPantryConfig env, HasLogFunc env, HasProcessContext env) - => PackageName -- ^ package name + => RequireHackageIndex + -> PackageName -- ^ package name -> UsePreferredVersions -> RIO env (Maybe PackageLocationImmutable) -getLatestHackageLocation name preferred = do +getLatestHackageLocation req name preferred = do mversion <- - fmap fst . 
Map.maxViewWithKey <$> getHackagePackageVersions preferred name + fmap fst . Map.maxViewWithKey <$> getHackagePackageVersions req preferred name let mVerCfKey = do (version, revisions) <- mversion (_rev, cfKey) <- fst <$> Map.maxViewWithKey revisions @@ -296,8 +303,8 @@ getLatestHackageLocation name preferred = do forM mVerCfKey $ \(version, cfKey@(BlobKey sha size)) -> do let pir = PackageIdentifierRevision name version (CFIHash sha (Just size)) - treeKey <- getHackageTarballKey pir - pure $ PLIHackage (PackageIdentifier name version) cfKey treeKey + treeKey' <- getHackageTarballKey pir + pure $ PLIHackage (PackageIdentifier name version) cfKey treeKey' -- | Returns the latest revision of the given package version available from -- Hackage. @@ -305,17 +312,18 @@ getLatestHackageLocation name preferred = do -- @since 0.1.0.0 getLatestHackageRevision :: (HasPantryConfig env, HasLogFunc env, HasProcessContext env) - => PackageName -- ^ package name + => RequireHackageIndex + -> PackageName -- ^ package name -> Version -> RIO env (Maybe (Revision, BlobKey, TreeKey)) -getLatestHackageRevision name version = do - revisions <- getHackagePackageVersionRevisions name version +getLatestHackageRevision req name version = do + revisions <- getHackagePackageVersionRevisions req name version case fmap fst $ Map.maxViewWithKey revisions of Nothing -> pure Nothing Just (revision, cfKey@(BlobKey sha size)) -> do let cfi = CFIHash sha (Just size) - treeKey <- getHackageTarballKey (PackageIdentifierRevision name version cfi) - return $ Just (revision, cfKey, treeKey) + treeKey' <- getHackageTarballKey (PackageIdentifierRevision name version cfi) + return $ Just (revision, cfKey, treeKey') fetchTreeKeys :: (HasPantryConfig env, HasLogFunc env, Foldable f) @@ -392,7 +400,7 @@ loadCabalFileImmutable loadCabalFileImmutable loc = withCache $ do logDebug $ "Parsing cabal file for " <> display loc bs <- loadCabalFileBytes loc - let foundCabalKey = BlobKey (SHA256.hashBytes bs) (FileSize (fromIntegral (B.length bs))) + let foundCabalKey = bsToBlobKey bs (_warnings, gpd) <- rawParseGPD (Left $ toRawPLI loc) bs let pm = case loc of @@ -438,7 +446,7 @@ loadCabalFileRawImmutable loadCabalFileRawImmutable loc = withCache $ do logDebug $ "Parsing cabal file for " <> display loc bs <- loadRawCabalFileBytes loc - let foundCabalKey = BlobKey (SHA256.hashBytes bs) (FileSize (fromIntegral (B.length bs))) + let foundCabalKey = bsToBlobKey bs (_warnings, gpd) <- rawParseGPD (Left loc) bs let rpm = case loc of @@ -698,7 +706,8 @@ loadPackage :: (HasPantryConfig env, HasLogFunc env, HasProcessContext env) => PackageLocationImmutable -> RIO env Package -loadPackage (PLIHackage ident cfHash tree) = getHackageTarball (pirForHash ident cfHash) (Just tree) +loadPackage (PLIHackage ident cfHash tree) = + htrPackage <$> getHackageTarball (pirForHash ident cfHash) (Just tree) loadPackage pli@(PLIArchive archive pm) = getArchivePackage (toRawPLI pli) (toRawArchive archive) (toRawPM pm) loadPackage (PLIRepo repo pm) = getRepo repo (toRawPM pm) @@ -709,7 +718,7 @@ loadPackageRaw :: (HasPantryConfig env, HasLogFunc env, HasProcessContext env) => RawPackageLocationImmutable -> RIO env Package -loadPackageRaw (RPLIHackage pir mtree) = getHackageTarball pir mtree +loadPackageRaw (RPLIHackage pir mtree) = htrPackage <$> getHackageTarball pir mtree loadPackageRaw rpli@(RPLIArchive archive pm) = getArchivePackage rpli archive pm loadPackageRaw (RPLIRepo repo rpm) = getRepo repo rpm @@ -735,8 +744,8 @@ completePackageLocation (RPLIHackage 
pir0@(PackageIdentifierRevision name versio pir = PackageIdentifierRevision name version cfi logDebug $ "Added in cabal file hash: " <> display pir pure (pir, BlobKey sha size) - treeKey <- getHackageTarballKey pir - pure $ PLIHackage (PackageIdentifier name version) cfKey treeKey + treeKey' <- getHackageTarballKey pir + pure $ PLIHackage (PackageIdentifier name version) cfKey treeKey' completePackageLocation pl@(RPLIArchive archive rpm) = do -- getArchive checks archive and package metadata (sha, size, package) <- getArchive pl archive rpm @@ -791,8 +800,7 @@ completeSnapshotLocation (RSLFilePath f) = pure $ SLFilePath f completeSnapshotLocation (RSLUrl url (Just blobKey)) = pure $ SLUrl url blobKey completeSnapshotLocation (RSLUrl url Nothing) = do bs <- loadFromURL url Nothing - let blobKey = BlobKey (SHA256.hashBytes bs) (FileSize $ fromIntegral $ B.length bs) - pure $ SLUrl url blobKey + pure $ SLUrl url (bsToBlobKey bs) -- | Fill in optional fields in a 'SnapshotLayer' for more reproducible builds. -- @@ -812,6 +820,7 @@ completeSnapshotLayer rsnapshot = do , slFlags = rslFlags rsnapshot , slHidden = rslHidden rsnapshot , slGhcOptions = rslGhcOptions rsnapshot + , slPublishTime = rslPublishTime rsnapshot } traverseConcurrently_ @@ -900,7 +909,7 @@ loadSnapshotRaw loc = do , rsPackages = mempty , rsDrop = mempty } - Right (rsl, _sha) -> do + Right (rsl, _) -> do snap0 <- loadSnapshotRaw $ rslParent rsl (packages, unused) <- addPackagesToSnapshot @@ -936,7 +945,7 @@ loadSnapshot loc = do , rsPackages = mempty , rsDrop = mempty } - Right (rsl, _sha) -> do + Right rsl -> do snap0 <- loadSnapshotRaw $ rslParent rsl (packages, unused) <- addPackagesToSnapshot @@ -957,6 +966,7 @@ loadSnapshot loc = do } type CompletedPLI = (RawPackageLocationImmutable, PackageLocationImmutable) +type CompletedSL = (RawSnapshotLocation, SnapshotLocation) -- | Parse a 'Snapshot' (all layers) from a 'SnapshotLocation' noting -- any incomplete package locations @@ -965,9 +975,11 @@ type CompletedPLI = (RawPackageLocationImmutable, PackageLocationImmutable) loadAndCompleteSnapshot :: (HasPantryConfig env, HasLogFunc env, HasProcessContext env) => SnapshotLocation - -> RIO env (Snapshot, [CompletedPLI]) -loadAndCompleteSnapshot loc = - loadAndCompleteSnapshotRaw (toRawSL loc) + -> Map RawSnapshotLocation SnapshotLocation -- ^ Cached snapshot locations from lock file + -> Map RawPackageLocationImmutable PackageLocationImmutable -- ^ Cached locations from lock file + -> RIO env (Snapshot, [CompletedSL], [CompletedPLI]) +loadAndCompleteSnapshot loc cachedSL cachedPL = + loadAndCompleteSnapshotRaw (toRawSL loc) cachedSL cachedPL -- | Parse a 'Snapshot' (all layers) from a 'RawSnapshotLocation' completing -- any incomplete package locations @@ -976,9 +988,13 @@ loadAndCompleteSnapshot loc = loadAndCompleteSnapshotRaw :: (HasPantryConfig env, HasLogFunc env, HasProcessContext env) => RawSnapshotLocation - -> RIO env (Snapshot, [CompletedPLI]) -loadAndCompleteSnapshotRaw loc = do - eres <- loadRawSnapshotLayer loc + -> Map RawSnapshotLocation SnapshotLocation -- ^ Cached snapshot locations from lock file + -> Map RawPackageLocationImmutable PackageLocationImmutable -- ^ Cached locations from lock file + -> RIO env (Snapshot, [CompletedSL], [CompletedPLI]) +loadAndCompleteSnapshotRaw rawLoc cacheSL cachePL = do + eres <- case Map.lookup rawLoc cacheSL of + Just loc -> right (\rsl -> (rsl, (rawLoc, loc))) <$> loadSnapshotLayer loc + Nothing -> loadRawSnapshotLayer rawLoc case eres of Left wc -> let snapshot = Snapshot @@ 
-986,12 +1002,13 @@ loadAndCompleteSnapshotRaw loc = do , snapshotPackages = mempty , snapshotDrop = mempty } - in pure (snapshot, []) - Right (rsl, _sha) -> do - (snap0, completed0) <- loadAndCompleteSnapshotRaw $ rslParent rsl + in pure (snapshot, [(RSLCompiler wc, SLCompiler wc)], []) + Right (rsl, sloc) -> do + (snap0, slocs, completed0) <- loadAndCompleteSnapshotRaw (rslParent rsl) cacheSL cachePL (packages, completed, unused) <- addAndCompletePackagesToSnapshot - (display loc) + rawLoc + cachePL (rslLocations rsl) AddPackagesConfig { apcDrop = rslDropPackages rsl @@ -1000,13 +1017,13 @@ loadAndCompleteSnapshotRaw loc = do , apcGhcOptions = rslGhcOptions rsl } (snapshotPackages snap0) - warnUnusedAddPackagesConfig (display loc) unused + warnUnusedAddPackagesConfig (display rawLoc) unused let snapshot = Snapshot { snapshotCompiler = fromMaybe (snapshotCompiler snap0) (rslCompiler rsl) , snapshotPackages = packages , snapshotDrop = apcDrop unused } - return (snapshot, completed ++ completed0) + return (snapshot, sloc : slocs,completed0 ++ completed) data SingleOrNot a = Single !a @@ -1120,6 +1137,16 @@ addPackagesToSnapshot source newPackages (AddPackagesConfig drops flags hiddens pure (allPackages, unused) +cachedSnapshotCompletePackageLocation :: (HasPantryConfig env, HasLogFunc env, HasProcessContext env) + => Map RawPackageLocationImmutable PackageLocationImmutable + -> RawPackageLocationImmutable + -> RIO env PackageLocationImmutable +cachedSnapshotCompletePackageLocation cachePackages rpli = do + let xs = Map.lookup rpli cachePackages + case xs of + Nothing -> completePackageLocation rpli + Just x -> pure x + -- | Add more packages to a snapshot completing their locations if needed -- -- Note that any settings on a parent flag which is being replaced will be @@ -1127,31 +1154,39 @@ addPackagesToSnapshot source newPackages (AddPackagesConfig drops flags hiddens -- set, and @foo@ also appears in new packages, then @bar@ will no longer be -- set. -- --- Returns any of the 'AddPackagesConfig' values not used. +-- Returns any of the 'AddPackagesConfig' values not used and also all +-- non-trivial package location completions. 
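-- A minimal usage sketch (not part of this patch) of the cache-aware
-- completion added above: thread the snapshot- and package-location caches
-- from a lock file through 'loadAndCompleteSnapshot', then merge the returned
-- completion pairs back into those caches for the next run. The helper name
-- 'completeWithCaches' is illustrative only; everything else it uses is
-- introduced in this patch or already exported by Pantry.
completeWithCaches
  :: (HasPantryConfig env, HasLogFunc env, HasProcessContext env)
  => SnapshotLocation
  -> Map RawSnapshotLocation SnapshotLocation
  -> Map RawPackageLocationImmutable PackageLocationImmutable
  -> RIO env ( Snapshot
             , Map RawSnapshotLocation SnapshotLocation
             , Map RawPackageLocationImmutable PackageLocationImmutable )
completeWithCaches loc slCache pliCache = do
  (snapshot, completedSLs, completedPLIs) <-
    loadAndCompleteSnapshot loc slCache pliCache
  -- Package-location pairs are only reported when completion changed the
  -- location, so the merge below keeps the caches from growing needlessly.
  pure ( snapshot
       , Map.fromList completedSLs `Map.union` slCache
       , Map.fromList completedPLIs `Map.union` pliCache
       )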
-- -- @since 0.1.0.0 addAndCompletePackagesToSnapshot :: (HasPantryConfig env, HasLogFunc env, HasProcessContext env) - => Utf8Builder + => RawSnapshotLocation -- ^ Text description of where these new packages are coming from, for error -- messages only + -> Map RawPackageLocationImmutable PackageLocationImmutable -- ^ Cached data from snapshot lock file -> [RawPackageLocationImmutable] -- ^ new packages -> AddPackagesConfig -> Map PackageName SnapshotPackage -- ^ packages from parent -> RIO env (Map PackageName SnapshotPackage, [CompletedPLI], AddPackagesConfig) -addAndCompletePackagesToSnapshot source newPackages (AddPackagesConfig drops flags hiddens options) old = do - let addPackage (ps, completed) loc = do - name <- getPackageLocationName loc - loc' <- completePackageLocation loc - let p = (name, SnapshotPackage - { spLocation = loc' +addAndCompletePackagesToSnapshot loc cachedPL newPackages (AddPackagesConfig drops flags hiddens options) old = do + let source = display loc + addPackage :: (HasPantryConfig env, HasLogFunc env, HasProcessContext env) + => ([(PackageName, SnapshotPackage)],[CompletedPLI]) + -> RawPackageLocationImmutable + -> RIO env ([(PackageName, SnapshotPackage)], [CompletedPLI]) + addPackage (ps, completed) rawLoc = do + complLoc <- cachedSnapshotCompletePackageLocation cachedPL rawLoc + let PackageIdentifier name _ = pliIdent complLoc + p = (name, SnapshotPackage + { spLocation = complLoc , spFlags = Map.findWithDefault mempty name flags , spHidden = Map.findWithDefault False name hiddens , spGhcOptions = Map.findWithDefault [] name options }) - if toRawPLI loc' == loc - then pure (p:ps, completed) - else pure (p:ps, (loc, loc'):completed) + completed' = if toRawPLI complLoc == rawLoc + then completed + else (rawLoc, complLoc):completed + pure (p:ps, completed') (revNew, revCompleted) <- foldM addPackage ([], []) newPackages let (newSingles, newMultiples) = partitionEithers @@ -1188,20 +1223,19 @@ addAndCompletePackagesToSnapshot source newPackages (AddPackagesConfig drops fla loadRawSnapshotLayer :: (HasPantryConfig env, HasLogFunc env) => RawSnapshotLocation - -> RIO env (Either WantedCompiler (RawSnapshotLayer, SHA256)) -- FIXME remove SHA? Be smart? + -> RIO env (Either WantedCompiler (RawSnapshotLayer, CompletedSL)) loadRawSnapshotLayer (RSLCompiler compiler) = pure $ Left compiler -loadRawSnapshotLayer sl@(RSLUrl url blob) = - handleAny (throwIO . InvalidSnapshot sl) $ do +loadRawSnapshotLayer rsl@(RSLUrl url blob) = + handleAny (throwIO . InvalidSnapshot rsl) $ do bs <- loadFromURL url blob value <- Yaml.decodeThrow bs - snapshot <- warningsParserHelperRaw sl value Nothing - pure $ Right (snapshot, SHA256.hashBytes bs) -loadRawSnapshotLayer sl@(RSLFilePath fp) = - handleAny (throwIO . InvalidSnapshot sl) $ do + snapshot <- warningsParserHelperRaw rsl value Nothing + pure $ Right (snapshot, (rsl, SLUrl url (bsToBlobKey bs))) +loadRawSnapshotLayer rsl@(RSLFilePath fp) = + handleAny (throwIO . InvalidSnapshot rsl) $ do value <- Yaml.decodeFileThrow $ toFilePath $ resolvedAbsolute fp - sha <- SHA256.hashFile $ toFilePath $ resolvedAbsolute fp - snapshot <- warningsParserHelperRaw sl value $ Just $ parent $ resolvedAbsolute fp - pure $ Right (snapshot, sha) + snapshot <- warningsParserHelperRaw rsl value $ Just $ parent $ resolvedAbsolute fp + pure $ Right (snapshot, (rsl, SLFilePath fp)) -- | Parse a 'SnapshotLayer' value from a 'SnapshotLocation'. 
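-- For reference, a sketch of what the 'bsToBlobKey' helper used above
-- computes (its actual definition lives in Pantry.Types): it pairs the SHA256
-- of a ByteString with its size, replacing the open-coded
-- BlobKey/SHA256.hashBytes/FileSize expressions removed elsewhere in this
-- patch. The name 'bsToBlobKeySketch' is illustrative only.
bsToBlobKeySketch :: ByteString -> BlobKey
bsToBlobKeySketch bs =
  BlobKey (SHA256.hashBytes bs) (FileSize (fromIntegral (B.length bs)))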
-- @@ -1213,20 +1247,19 @@ loadRawSnapshotLayer sl@(RSLFilePath fp) = loadSnapshotLayer :: (HasPantryConfig env, HasLogFunc env) => SnapshotLocation - -> RIO env (Either WantedCompiler (RawSnapshotLayer, SHA256)) -- FIXME remove SHA? Be smart? + -> RIO env (Either WantedCompiler RawSnapshotLayer) loadSnapshotLayer (SLCompiler compiler) = pure $ Left compiler loadSnapshotLayer sl@(SLUrl url blob) = handleAny (throwIO . InvalidSnapshot (toRawSL sl)) $ do bs <- loadFromURL url (Just blob) value <- Yaml.decodeThrow bs snapshot <- warningsParserHelper sl value Nothing - pure $ Right (snapshot, SHA256.hashBytes bs) + pure $ Right snapshot loadSnapshotLayer sl@(SLFilePath fp) = handleAny (throwIO . InvalidSnapshot (toRawSL sl)) $ do value <- Yaml.decodeFileThrow $ toFilePath $ resolvedAbsolute fp - sha <- SHA256.hashFile $ toFilePath $ resolvedAbsolute fp snapshot <- warningsParserHelper sl value $ Just $ parent $ resolvedAbsolute fp - pure $ Right (snapshot, sha) + pure $ Right snapshot loadFromURL :: (HasPantryConfig env, HasLogFunc env) @@ -1339,7 +1372,7 @@ getRawPackageLocationTreeKey -> RIO env TreeKey getRawPackageLocationTreeKey pl = case getRawTreeKey pl of - Just treeKey -> pure treeKey + Just treeKey' -> pure treeKey' Nothing -> case pl of RPLIHackage pir _ -> getHackageTarballKey pir @@ -1375,6 +1408,9 @@ getTreeKey (PLIRepo _ pm) = pmTreeKey pm data PantryApp = PantryApp { paSimpleApp :: !SimpleApp , paPantryConfig :: !PantryConfig + , paUseColor :: !Bool + , paTermWidth :: !Int + , paStylesUpdate :: !StylesUpdate } simpleAppL :: Lens' PantryApp SimpleApp @@ -1389,6 +1425,11 @@ instance HasPantryConfig PantryApp where pantryConfigL = lens paPantryConfig (\x y -> x { paPantryConfig = y }) instance HasProcessContext PantryApp where processContextL = simpleAppL.processContextL +instance HasStylesUpdate PantryApp where + stylesUpdateL = lens paStylesUpdate (\x y -> x { paStylesUpdate = y }) +instance HasTerm PantryApp where + useColorL = lens paUseColor (\x y -> x { paUseColor = y }) + termWidthL = lens paTermWidth (\x y -> x { paTermWidth = y }) -- | Run some code against pantry using basic sane settings. -- @@ -1410,6 +1451,9 @@ runPantryApp f = runSimpleApp $ do PantryApp { paSimpleApp = sa , paPantryConfig = pc + , paTermWidth = 100 + , paUseColor = True + , paStylesUpdate = mempty } f @@ -1431,6 +1475,9 @@ runPantryAppClean f = liftIO $ withSystemTempDirectory "pantry-clean" $ \dir -> PantryApp { paSimpleApp = sa , paPantryConfig = pc + , paTermWidth = 100 + , paUseColor = True + , paStylesUpdate = mempty } f @@ -1438,17 +1485,17 @@ runPantryAppClean f = liftIO $ withSystemTempDirectory "pantry-clean" $ \dir -> -- -- @since 0.1.0.0 loadGlobalHints - :: HasTerm env - => Path Abs File -- ^ local cached file location - -> WantedCompiler + :: (HasTerm env, HasPantryConfig env) + => WantedCompiler -> RIO env (Maybe (Map PackageName Version)) -loadGlobalHints dest wc = +loadGlobalHints wc = inner False where inner alreadyDownloaded = do + dest <- getGlobalHintsFile req <- parseRequest "https://raw.githubusercontent.com/fpco/stackage-content/master/stack/global-hints.yaml" downloaded <- download req dest - eres <- tryAny inner2 + eres <- tryAny (inner2 dest) mres <- case eres of Left e -> Nothing <$ logError ("Error when parsing global hints: " <> displayShow e) @@ -1467,7 +1514,8 @@ loadGlobalHints dest wc = pure Nothing _ -> pure mres - inner2 = liftIO + inner2 dest + = liftIO $ Map.lookup wc . fmap (fmap unCabalString . 
unCabalStringMap) <$> Yaml.decodeFileThrow (toFilePath dest) diff --git a/subs/pantry/src/Pantry/Archive.hs b/subs/pantry/src/Pantry/Archive.hs index 4549445792..080f380e79 100644 --- a/subs/pantry/src/Pantry/Archive.hs +++ b/subs/pantry/src/Pantry/Archive.hs @@ -13,7 +13,7 @@ module Pantry.Archive import RIO import qualified Pantry.SHA256 as SHA256 -import Pantry.Storage +import Pantry.Storage hiding (Tree, TreeEntry) import Pantry.Tree import Pantry.Types import RIO.Process @@ -355,11 +355,11 @@ parseArchive rpli archive fp = do getFiles ats Right files -> pure (at, Map.fromList $ map (mePath &&& id) $ files []) (at :: ArchiveType, files :: Map FilePath MetaEntry) <- getFiles [minBound..maxBound] - let toSimple :: MetaEntry -> Either String SimpleEntry - toSimple me = + let toSimple :: FilePath -> MetaEntry -> Either String (Map FilePath SimpleEntry) + toSimple key me = case meType me of - METNormal -> Right $ SimpleEntry (mePath me) FTNormal - METExecutable -> Right $ SimpleEntry (mePath me) FTExecutable + METNormal -> Right $ Map.singleton key $ SimpleEntry (mePath me) FTNormal + METExecutable -> Right $ Map.singleton key $ SimpleEntry (mePath me) FTExecutable METLink relDest -> do case relDest of '/':_ -> Left $ concat @@ -393,17 +393,22 @@ parseArchive rpli archive fp = do , e ] Right x -> Right x + -- Check if it's a symlink to a file case Map.lookup dest files of - Nothing -> Left $ "Symbolic link dest not found from " ++ mePath me ++ " to " ++ relDest ++ ", looking for " ++ dest ++ ".\n" - ++ "This may indicate that the source is a git archive which uses git-annex.\n" - ++ "See https://github.com/commercialhaskell/stack/issues/4579 for further information." + Nothing -> + -- Check if it's a symlink to a directory + case findWithPrefix dest files of + [] -> Left $ "Symbolic link dest not found from " ++ mePath me ++ " to " ++ relDest ++ ", looking for " ++ dest ++ ".\n" + ++ "This may indicate that the source is a git archive which uses git-annex.\n" + ++ "See https://github.com/commercialhaskell/stack/issues/4579 for further information." + pairs -> fmap fold $ for pairs $ \(suffix, me') -> toSimple (key ++ '/' : suffix) me' Just me' -> case meType me' of - METNormal -> Right $ SimpleEntry dest FTNormal - METExecutable -> Right $ SimpleEntry dest FTExecutable + METNormal -> Right $ Map.singleton key $ SimpleEntry dest FTNormal + METExecutable -> Right $ Map.singleton key $ SimpleEntry dest FTExecutable METLink _ -> Left $ "Symbolic link dest cannot be a symbolic link, from " ++ mePath me ++ " to " ++ relDest - case traverse toSimple files of + case fold <$> Map.traverseWithKey toSimple files of Left e -> throwIO $ UnsupportedTarball loc $ T.pack e Right files1 -> do let files2 = stripCommonPrefix $ Map.toList files1 @@ -447,7 +452,7 @@ parseArchive rpli archive fp = do BFCabal _ _ -> when (buildFilePath /= cabalFileName name) $ throwIO $ WrongCabalFileName rpli buildFilePath name _ -> return () -- It's good! 
Store the tree, let's bounce - (tid, treeKey) <- withStorage $ storeTree rpli ident tree buildFile + (tid, treeKey') <- withStorage $ storeTree rpli ident tree buildFile packageCabal <- case buildFile of BFCabal _ _ -> pure $ PCCabalFile buildFileEntry BFHpack _ -> do @@ -458,12 +463,21 @@ parseArchive rpli archive fp = do let cabalTreeEntry = TreeEntry cabalKey (teType buildFileEntry) pure $ PCHpack $ PHpack { phOriginal = buildFileEntry, phGenerated = cabalTreeEntry, phVersion = hpackSoftwareVersion} pure Package - { packageTreeKey = treeKey + { packageTreeKey = treeKey' , packageTree = tree , packageCabalEntry = packageCabal , packageIdent = ident } +-- | Find all of the files in the Map with the given directory as a +-- prefix. Directory is given without trailing slash. Returns the +-- suffix after stripping the given prefix. +findWithPrefix :: FilePath -> Map FilePath MetaEntry -> [(FilePath, MetaEntry)] +findWithPrefix dir = mapMaybe go . Map.toList + where + prefix = dir ++ "/" + go (x, y) = (, y) <$> List.stripPrefix prefix x + findCabalOrHpackFile :: MonadThrow m => RawPackageLocationImmutable -- ^ for exceptions diff --git a/subs/pantry/src/Pantry/Hackage.hs b/subs/pantry/src/Pantry/Hackage.hs index eecf247237..fc02529ad0 100644 --- a/subs/pantry/src/Pantry/Hackage.hs +++ b/subs/pantry/src/Pantry/Hackage.hs @@ -4,7 +4,9 @@ {-# LANGUAGE ScopedTypeVariables #-} module Pantry.Hackage ( updateHackageIndex + , forceUpdateHackageIndex , DidUpdateOccur (..) + , RequireHackageIndex (..) , hackageIndexTarballL , getHackageTarball , getHackageTarballKey @@ -13,6 +15,7 @@ module Pantry.Hackage , getHackagePackageVersionRevisions , getHackageTypoCorrections , UsePreferredVersions (..) + , HackageTarballResult(..) ) where import RIO @@ -27,7 +30,7 @@ import qualified RIO.ByteString as B import qualified RIO.ByteString.Lazy as BL import Pantry.Archive import Pantry.Types hiding (FileType (..)) -import Pantry.Storage +import Pantry.Storage hiding (TreeEntry, PackageName, Version) import Pantry.Tree import qualified Pantry.SHA256 as SHA256 import Network.URI (parseURI) @@ -38,6 +41,7 @@ import qualified Distribution.PackageDescription as Cabal import System.IO (SeekMode (..)) import qualified Data.List.NonEmpty as NE import Data.Text.Metrics (damerauLevenshtein) +import Distribution.PackageDescription (GenericPackageDescription) import Distribution.Types.Version (versionNumbers) import Distribution.Types.VersionRange (withinRange) @@ -68,6 +72,17 @@ hackageIndexTarballL = hackageDirL.to ( indexRelFile) -- @since 0.1.0.0 data DidUpdateOccur = UpdateOccurred | NoUpdateOccurred + +-- | Information returned by `getHackageTarball` +-- +-- @since 0.1.0.0 +data HackageTarballResult = HackageTarballResult + { htrPackage :: !Package + -- ^ Package that was loaded from Hackage tarball + , htrFreshPackageInfo :: !(Maybe (GenericPackageDescription, TreeId)) + -- ^ This information is only available whenever package was just loaded into pantry. + } + -- | Download the most recent 01-index.tar file from Hackage and -- update the database tables. -- @@ -79,7 +94,26 @@ updateHackageIndex :: (HasPantryConfig env, HasLogFunc env) => Maybe Utf8Builder -- ^ reason for updating, if any -> RIO env DidUpdateOccur -updateHackageIndex mreason = do +updateHackageIndex = updateHackageIndexInternal False + +-- | Same as `updateHackageIndex`, but force the database update even if hackage +-- security tells that there is no change. 
This can be useful in order to make +-- sure the database is in sync with the locally downloaded tarball +-- +-- @since 0.1.0.0 +forceUpdateHackageIndex + :: (HasPantryConfig env, HasLogFunc env) + => Maybe Utf8Builder + -> RIO env DidUpdateOccur +forceUpdateHackageIndex = updateHackageIndexInternal True + + +updateHackageIndexInternal + :: (HasPantryConfig env, HasLogFunc env) + => Bool -- ^ Force the database update. + -> Maybe Utf8Builder -- ^ reason for updating, if any + -> RIO env DidUpdateOccur +updateHackageIndexInternal forceUpdate mreason = do storage <- view $ pantryConfigL.to pcStorage gateUpdate $ withWriteLock_ storage $ do for_ mreason logInfo @@ -117,12 +151,39 @@ updateHackageIndex mreason = do HS.checkForUpdates repo maybeNow case didUpdate of - HS.NoUpdates -> logInfo "No package index update available" + _ | forceUpdate -> do + logInfo "Forced package update is initialized" + updateCache tarball + HS.NoUpdates -> do + x <- needsCacheUpdate tarball + if x + then do + logInfo "No package index update available, but didn't update cache last time, running now" + updateCache tarball + else logInfo "No package index update available and cache up to date" HS.HasUpdates -> do logInfo "Updated package index downloaded" updateCache tarball logStickyDone "Package index cache populated" where + -- The size of the new index tarball, ignoring the required + -- (by the tar spec) 1024 null bytes at the end, which will be + -- mutated in the future by other updates. + getTarballSize :: MonadIO m => Handle -> m Word + getTarballSize h = (fromIntegral . max 0 . subtract 1024) <$> hFileSize h + + -- Check if the size of the tarball on the disk matches the value + -- in CacheUpdate. If not, we need to perform a cache update, even + -- if we didn't download any new information. This can be caused + -- by canceling an updateCache call. + needsCacheUpdate tarball = do + mres <- withStorage loadLatestCacheUpdate + case mres of + Nothing -> pure True + Just (FileSize cachedSize, _sha256) -> do + actualSize <- withBinaryFile (toFilePath tarball) ReadMode getTarballSize + pure $ cachedSize /= actualSize + -- This is the one action in the Pantry codebase known to hold a -- write lock on the database for an extended period of time. To -- avoid failures due to SQLite locks failing, we take our own @@ -152,10 +213,7 @@ updateHackageIndex mreason = do (offset, newHash, newSize) <- lift $ withBinaryFile (toFilePath tarball) ReadMode $ \h -> do logInfo "Calculating hashes to check for hackage-security rebases or filesystem changes" - -- The size of the new index tarball, ignoring the required - -- (by the tar spec) 1024 null bytes at the end, which will be - -- mutated in the future by other updates. - newSize :: Word <- (fromIntegral . max 0 . 
subtract 1024) <$> hFileSize h + newSize <- getTarballSize h let sinkSHA256 len = takeCE (fromIntegral len) .| SHA256.sinkHash case minfo of @@ -178,11 +236,13 @@ updateHackageIndex mreason = do if oldHash == oldHashCheck then oldSize <$ logInfo "Updating preexisting cache, should be quick" else 0 <$ do - logInfo "Package index change detected, that's pretty unusual" - logInfo $ "Old size: " <> display oldSize - logInfo $ "Old hash (orig) : " <> display oldHash - logInfo $ "New hash (check): " <> display oldHashCheck - logInfo "Forcing a recache" + logWarn $ mconcat [ + "Package index change detected, that's pretty unusual: " + , "\n Old size: " <> display oldSize + , "\n Old hash (orig) : " <> display oldHash + , "\n New hash (check): " <> display oldHashCheck + , "\n Forcing a recache" + ] pure (offset, newHash, newSize) lift $ logInfo $ "Populating cache from file size " <> display newSize <> ", hash " <> display newHash @@ -335,7 +395,7 @@ fuzzyLookupCandidates -> Version -> RIO env FuzzyResults fuzzyLookupCandidates name ver0 = do - m <- getHackagePackageVersions UsePreferredVersions name + m <- getHackagePackageVersions YesRequireHackageIndex UsePreferredVersions name if Map.null m then FRNameNotFound <$> getHackageTypoCorrections name else @@ -390,18 +450,37 @@ getHackageTypoCorrections name1 = data UsePreferredVersions = UsePreferredVersions | IgnorePreferredVersions deriving Show +-- | Require that the Hackage index is populated. +-- +-- @since 0.1.0.0 +data RequireHackageIndex + = YesRequireHackageIndex + -- ^ If there is nothing in the Hackage index, then perform an update + | NoRequireHackageIndex + -- ^ Do not perform an update + deriving Show + +initializeIndex + :: (HasPantryConfig env, HasLogFunc env) + => RequireHackageIndex + -> RIO env () +initializeIndex NoRequireHackageIndex = pure () +initializeIndex YesRequireHackageIndex = do + cabalCount <- withStorage countHackageCabals + when (cabalCount == 0) $ void $ + updateHackageIndex $ Just $ "No information from Hackage index, updating" + -- | Returns the versions of the package available on Hackage. 
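-- A minimal usage sketch for the new 'RequireHackageIndex' parameter threaded
-- through the lookup functions below: 'YesRequireHackageIndex' triggers a
-- one-off index update when the local index has never been populated, while
-- 'NoRequireHackageIndex' never does. The helper name and the choice of
-- acme-missiles are illustrative only; 'mkPackageName' comes from Cabal's
-- Distribution.Types.PackageName.
acmeMissilesVersions
  :: (HasPantryConfig env, HasLogFunc env)
  => RIO env (Map Version (Map Revision BlobKey))
acmeMissilesVersions =
  getHackagePackageVersions
    YesRequireHackageIndex
    UsePreferredVersions
    (mkPackageName "acme-missiles")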
-- -- @since 0.1.0.0 getHackagePackageVersions :: (HasPantryConfig env, HasLogFunc env) - => UsePreferredVersions + => RequireHackageIndex + -> UsePreferredVersions -> PackageName -- ^ package name -> RIO env (Map Version (Map Revision BlobKey)) -getHackagePackageVersions usePreferred name = do - cabalCount <- withStorage countHackageCabals - when (cabalCount == 0) $ void $ - updateHackageIndex $ Just $ "No information from Hackage index, updating" +getHackagePackageVersions req usePreferred name = do + initializeIndex req withStorage $ do mpreferred <- case usePreferred of @@ -420,13 +499,12 @@ getHackagePackageVersions usePreferred name = do -- @since 0.1.0.0 getHackagePackageVersionRevisions :: (HasPantryConfig env, HasLogFunc env) - => PackageName -- ^ package name + => RequireHackageIndex + -> PackageName -- ^ package name -> Version -- ^ package version -> RIO env (Map Revision BlobKey) -getHackagePackageVersionRevisions name version = do - cabalCount <- withStorage countHackageCabals - when (cabalCount == 0) $ void $ - updateHackageIndex $ Just $ "No information from Hackage index, updating" +getHackagePackageVersionRevisions req name version = do + initializeIndex req withStorage $ Map.map snd <$> loadHackagePackageVersion name version @@ -436,16 +514,17 @@ withCachedTree -> PackageName -> Version -> BlobId -- ^ cabal file contents - -> RIO env Package - -> RIO env Package + -> RIO env HackageTarballResult + -> RIO env HackageTarballResult withCachedTree rpli name ver bid inner = do mres <- withStorage $ loadHackageTree rpli name ver bid case mres of - Just package -> pure package + Just package -> pure $ HackageTarballResult package Nothing Nothing -> do - package <- inner - withStorage $ storeHackageTree name ver bid $ packageTreeKey package - pure package + htr <- inner + withStorage $ + storeHackageTree name ver bid $ packageTreeKey $ htrPackage htr + pure htr getHackageTarballKey :: (HasPantryConfig env, HasLogFunc env, HasProcessContext env) @@ -454,20 +533,21 @@ getHackageTarballKey getHackageTarballKey pir@(PackageIdentifierRevision name ver (CFIHash sha _msize)) = do mres <- withStorage $ loadHackageTreeKey name ver sha case mres of - Nothing -> packageTreeKey <$> getHackageTarball pir Nothing + Nothing -> packageTreeKey . htrPackage <$> getHackageTarball pir Nothing Just key -> pure key -getHackageTarballKey pir = packageTreeKey <$> getHackageTarball pir Nothing +getHackageTarballKey pir = packageTreeKey . 
htrPackage <$> getHackageTarball pir Nothing getHackageTarball :: (HasPantryConfig env, HasLogFunc env, HasProcessContext env) => PackageIdentifierRevision -> Maybe TreeKey - -> RIO env Package -getHackageTarball pir@(PackageIdentifierRevision name ver _cfi) mtreeKey = do + -> RIO env HackageTarballResult +getHackageTarball pir mtreeKey = do + let PackageIdentifierRevision name ver _cfi = pir cabalFile <- resolveCabalFileInfo pir - cabalFileKey <- withStorage $ getBlobKey cabalFile let rpli = RPLIHackage pir mtreeKey withCachedTree rpli name ver cabalFile $ do + cabalFileKey <- withStorage $ getBlobKey cabalFile mpair <- withStorage $ loadHackageTarballInfo name ver (sha, size) <- case mpair of @@ -484,55 +564,66 @@ getHackageTarball pir@(PackageIdentifierRevision name ver _cfi) mtreeKey = do Just pair2 -> pure pair2 pc <- view pantryConfigL let urlPrefix = hscDownloadPrefix $ pcHackageSecurity pc - url = mconcat - [ urlPrefix - , "package/" - , T.pack $ Distribution.Text.display name - , "-" - , T.pack $ Distribution.Text.display ver - , ".tar.gz" - ] - package <- getArchivePackage - rpli - RawArchive - { raLocation = ALUrl url - , raHash = Just sha - , raSize = Just size - , raSubdir = T.empty -- no subdirs on Hackage - } - RawPackageMetadata - { rpmName = Just name - , rpmVersion = Just ver - , rpmTreeKey = Nothing -- with a revision cabal file will differ giving a different tree - , rpmCabal = Nothing -- cabal file in the tarball may be different! - } - + url = + mconcat + [ urlPrefix + , "package/" + , T.pack $ Distribution.Text.display name + , "-" + , T.pack $ Distribution.Text.display ver + , ".tar.gz" + ] + package <- + getArchivePackage + rpli + RawArchive + { raLocation = ALUrl url + , raHash = Just sha + , raSize = Just size + , raSubdir = T.empty -- no subdirs on Hackage + } + RawPackageMetadata + { rpmName = Just name + , rpmVersion = Just ver + , rpmTreeKey = Nothing -- with a revision cabal file will differ giving a different tree + , rpmCabal = Nothing -- cabal file in the tarball may be different! 
+ } case packageTree package of TreeMap m -> do - let (PCCabalFile (TreeEntry _ ft)) = packageCabalEntry package + let ft = + case packageCabalEntry package of + PCCabalFile (TreeEntry _ ft') -> ft' + _ -> error "Impossible: Hackage does not support hpack" cabalEntry = TreeEntry cabalFileKey ft tree' = TreeMap $ Map.insert (cabalFileName name) cabalEntry m ident = PackageIdentifier name ver - - cabalBS <- withStorage $ do - let BlobKey sha' _ = cabalFileKey - mcabalBS <- loadBlobBySHA sha' - case mcabalBS of - Nothing -> error $ "Invariant violated, cabal file key: " ++ show cabalFileKey - Just bid -> loadBlobById bid - + cabalBS <- + withStorage $ do + let BlobKey sha' _ = cabalFileKey + mcabalBS <- loadBlobBySHA sha' + case mcabalBS of + Nothing -> + error $ + "Invariant violated, cabal file key: " ++ show cabalFileKey + Just bid -> loadBlobById bid (_warnings, gpd) <- rawParseGPD (Left rpli) cabalBS let gpdIdent = Cabal.package $ Cabal.packageDescription gpd - when (ident /= gpdIdent) $ throwIO $ - MismatchedCabalFileForHackage pir Mismatch - { mismatchExpected = ident - , mismatchActual = gpdIdent + when (ident /= gpdIdent) $ + throwIO $ + MismatchedCabalFileForHackage + pir + Mismatch {mismatchExpected = ident, mismatchActual = gpdIdent} + (tid, treeKey') <- + withStorage $ + storeTree rpli ident tree' (BFCabal (cabalFileName name) cabalEntry) + pure + HackageTarballResult + { htrPackage = + Package + { packageTreeKey = treeKey' + , packageTree = tree' + , packageIdent = ident + , packageCabalEntry = PCCabalFile cabalEntry + } + , htrFreshPackageInfo = Just (gpd, tid) } - - (_tid, treeKey') <- withStorage $ storeTree rpli ident tree' (BFCabal (cabalFileName name) cabalEntry) - pure Package - { packageTreeKey = treeKey' - , packageTree = tree' - , packageIdent = ident - , packageCabalEntry = PCCabalFile cabalEntry - } diff --git a/subs/pantry/src/Pantry/Internal.hs b/subs/pantry/src/Pantry/Internal.hs index be603a94f9..0ec4f118ad 100644 --- a/subs/pantry/src/Pantry/Internal.hs +++ b/subs/pantry/src/Pantry/Internal.hs @@ -9,32 +9,39 @@ module Pantry.Internal , pcHpackExecutable , normalizeParents , makeTarRelative + , getGlobalHintsFile ) where import Control.Exception (assert) import Pantry.Types import qualified Data.Text as T +import Data.Maybe (fromMaybe) -- | Like @System.FilePath.normalise@, however: -- -- * Only works on relative paths, absolute paths fail -- --- * May not point to directories +-- * Strips trailing slashes -- -- * Only works on forward slashes, even on Windows -- -- * Normalizes parent dirs @foo/../@ get stripped -- +-- * Cannot begin with a parent directory (@../@) +-- -- * Spelled like an American, sorry normalizeParents :: FilePath -> Either String FilePath normalizeParents "" = Left "empty file path" normalizeParents ('/':_) = Left "absolute path" +normalizeParents ('.':'.':'/':_) = Left "absolute path" normalizeParents fp = do - let t = T.pack fp + -- Strip a single trailing, but not multiple + let t0 = T.pack fp + t = fromMaybe t0 $ T.stripSuffix "/" t0 case T.unsnoc t of - Just (_, '/') -> Left "trailing slash" + Just (_, '/') -> Left "multiple trailing slashes" _ -> Right () let c1 = T.split (== '/') t diff --git a/subs/pantry/src/Pantry/Internal/Companion.hs b/subs/pantry/src/Pantry/Internal/Companion.hs new file mode 100644 index 0000000000..53b9ec33ef --- /dev/null +++ b/subs/pantry/src/Pantry/Internal/Companion.hs @@ -0,0 +1,76 @@ +{-# LANGUAGE RankNTypes #-} +{-# LANGUAGE NoImplicitPrelude #-} +-- | Companion threads, such as for printing messages 
saying we're +-- still busy. Ultimately this could be put into its own package. +module Pantry.Internal.Companion + ( withCompanion + , onCompanionDone + , Companion + , Delay + , StopCompanion + ) where + +import RIO + +-- | A companion thread which can perform arbitrary actions as well as delay +type Companion m = Delay -> m () + +-- | Delay the given number of microseconds. If 'StopCompanion' is +-- triggered before the timer completes, a 'CompanionDone' exception +-- will be thrown (which is caught internally by 'withCompanion'). +type Delay = forall mio. MonadIO mio => Int -> mio () + +-- | Tell the 'Companion' to stop. The next time 'Delay' is +-- called, or if a 'Delay' is currently blocking, the 'Companion' thread +-- will exit with a 'CompanionDone' exception. +type StopCompanion m = m () + +-- | When a delay was interrupted because we're told to stop, perform +-- this action. +onCompanionDone + :: MonadUnliftIO m + => m () -- ^ the delay + -> m () -- ^ action to perform + -> m () +onCompanionDone theDelay theAction = + theDelay `withException` \CompanionDone -> theAction + +-- | Internal exception used by 'withCompanion' to allow short-circuiting +-- of the 'Companion'. Should not be used outside of this module. +data CompanionDone = CompanionDone + deriving (Show, Typeable) +instance Exception CompanionDone + +-- | Keep running the 'Companion' action until either the inner action +-- completes or calls the 'StopCompanion' action. This can be used to +-- give the user status information while running a long running +-- operations. +withCompanion + :: forall m a. MonadUnliftIO m + => Companion m + -> (StopCompanion m -> m a) + -> m a +withCompanion companion inner = do + -- Variable to indicate 'Delay'ing should result in a 'CompanionDone' + -- exception. + shouldStopVar <- newTVarIO False + let -- Relatively simple: set shouldStopVar to True + stopCompanion = atomically $ writeTVar shouldStopVar True + + delay :: Delay + delay usec = do + -- Register a delay with the runtime system + delayDoneVar <- registerDelay usec + join $ atomically $ + -- Delay has triggered, keep going + (pure () <$ (readTVar delayDoneVar >>= checkSTM)) <|> + -- Time to stop the companion, throw a 'CompanionDone' exception immediately + (throwIO CompanionDone <$ (readTVar shouldStopVar >>= checkSTM)) + + -- Run the 'Companion' and inner action together + runConcurrently $ + -- Ignore a 'CompanionDone' exception from the companion, that's expected behavior + Concurrently (companion delay `catch` \CompanionDone -> pure ()) *> + -- Run the inner action, giving it the 'StopCompanion' action, and + -- ensuring it is called regardless of exceptions. + Concurrently (inner stopCompanion `finally` stopCompanion) diff --git a/subs/pantry/src/Pantry/Internal/Stackage.hs b/subs/pantry/src/Pantry/Internal/Stackage.hs new file mode 100644 index 0000000000..1a76a2c5d9 --- /dev/null +++ b/subs/pantry/src/Pantry/Internal/Stackage.hs @@ -0,0 +1,51 @@ +-- | All types and functions exported from this module are for advanced usage +-- only. They are needed for stackage-server integration with pantry. +module Pantry.Internal.Stackage + ( module X + ) where + +import Pantry.Hackage as X + ( forceUpdateHackageIndex + , getHackageTarball + , HackageTarballResult(..) + ) +import Pantry.Storage as X + ( BlobId + , EntityField(..) + , HackageCabalId + , ModuleNameId + , PackageName + , PackageNameId + , Tree(..) + , TreeEntry(..) + , TreeEntryId + , TreeId + , Unique(..) 
+ , Version + , VersionId + , getBlobKey + , getPackageNameById + , getPackageNameId + , getTreeForKey + , getVersionId + , loadBlobById + , migrateAll + , treeCabal + , Key(unBlobKey) + ) +import Pantry.Types as X + ( ModuleNameP(..) + , PackageNameP(..) + , PantryConfig(..) + , SafeFilePath + , Storage(..) + , VersionP(..) + , mkSafeFilePath + , packageNameString + , packageTreeKey + , parsePackageName + , parseVersion + , parseVersionThrowing + , unSafeFilePath + , versionString + ) diff --git a/subs/pantry/src/Pantry/SQLite.hs b/subs/pantry/src/Pantry/SQLite.hs index d2413c7e27..9a37922da9 100644 --- a/subs/pantry/src/Pantry/SQLite.hs +++ b/subs/pantry/src/Pantry/SQLite.hs @@ -1,5 +1,6 @@ {-# LANGUAGE NoImplicitPrelude #-} {-# LANGUAGE OverloadedStrings #-} +{-# LANGUAGE ScopedTypeVariables #-} module Pantry.SQLite ( Storage (..) , initStorage @@ -12,6 +13,7 @@ import Path (Path, Abs, File, toFilePath, parent) import Path.IO (ensureDir) import Pantry.Types (PantryException (MigrationFailure), Storage (..)) import System.FileLock (withFileLock, withTryFileLock, SharedExclusive (..)) +import Pantry.Internal.Companion initStorage :: HasLogFunc env @@ -23,14 +25,31 @@ initStorage initStorage description migration fp inner = do ensureDir $ parent fp - migrates <- withWriteLock fp $ wrapMigrationFailure $ + migrates <- withWriteLock (display description) fp $ wrapMigrationFailure $ withSqliteConnInfo (sqinfo True) $ runReaderT $ runMigrationSilent migration forM_ migrates $ \mig -> logDebug $ "Migration executed: " <> display mig + -- This addresses a weird race condition that can result in a + -- deadlock. If multiple threads in the same process try to access + -- the database, it's possible that they will end up deadlocking on + -- the file lock, due to delays which can occur in the freeing of + -- previous locks. I don't fully grok the situation yet, but + -- introducing an MVar to ensure that, within a process, only one + -- thread is attempting to lock the file is both a valid workaround + -- and good practice. + baton <- newMVar () + withSqlitePoolInfo (sqinfo False) 1 $ \pool -> inner $ Storage - { withStorage_ = flip runSqlPool pool - , withWriteLock_ = withWriteLock fp + -- NOTE: Currently, we take a write lock on every action. This is + -- a bit heavyweight, but it avoids the SQLITE_BUSY errors + -- reported in + -- + -- completely. We can investigate more elegant solutions in the + -- future, such as separate read and write actions or introducing + -- smarter retry logic. + { withStorage_ = withMVar baton . const . withWriteLock (display description) fp . flip runSqlPool pool + , withWriteLock_ = id } where wrapMigrationFailure = handleAny (throwIO . MigrationFailure description fp) @@ -52,33 +71,30 @@ initStorage description migration fp inner = do -- above. withWriteLock :: HasLogFunc env - => Path Abs File -- ^ SQLite database file + => Utf8Builder -- ^ database description, for lock messages + -> Path Abs File -- ^ SQLite database file -> RIO env a -> RIO env a -withWriteLock dbFile inner = do +withWriteLock desc dbFile inner = do let lockFile = toFilePath dbFile ++ ".pantry-write-lock" withRunInIO $ \run -> do mres <- withTryFileLock lockFile Exclusive $ const $ run inner case mres of Just res -> pure res Nothing -> do - run $ logInfo "Unable to get a write lock on the Pantry database, waiting..." 
- shouldStopComplainingVar <- newTVarIO False - let complainer = fix $ \loop -> do - delay <- registerDelay $ 60 * 1000 * 1000 -- 1 minute - shouldComplain <- - atomically $ - -- Delay has triggered, time to complain again - (readTVar delay >>= checkSTM >> pure True) <|> - -- Time to stop complaining, ignore that delay immediately - (readTVar shouldStopComplainingVar >>= checkSTM >> pure False) - when shouldComplain $ do - run $ logWarn "Still waiting on the Pantry database write lock..." - loop - stopComplaining = atomically $ writeTVar shouldStopComplainingVar True - worker = withFileLock lockFile Exclusive $ const $ do - run $ logInfo "Acquired the Pantry database write lock" - stopComplaining - run inner - runConcurrently $ Concurrently complainer - *> Concurrently (worker `finally` stopComplaining) + let complainer :: Companion IO + complainer delay = run $ do + -- Wait five seconds before giving the first message to + -- avoid spamming the user for uninteresting file locks + delay $ 5 * 1000 * 1000 -- 5 seconds + logInfo $ "Unable to get a write lock on the " <> desc <> " database, waiting..." + + -- Now loop printing a message every 1 minute + forever $ do + delay (60 * 1000 * 1000) -- 1 minute + `onCompanionDone` logInfo ("Acquired the " <> desc <> " database write lock") + logWarn ("Still waiting on the " <> desc <> " database write lock...") + withCompanion complainer $ \stopComplaining -> + withFileLock lockFile Exclusive $ const $ do + stopComplaining + run inner diff --git a/subs/pantry/src/Pantry/Storage.hs b/subs/pantry/src/Pantry/Storage.hs index 539d28c5df..eeecb8d75f 100644 --- a/subs/pantry/src/Pantry/Storage.hs +++ b/subs/pantry/src/Pantry/Storage.hs @@ -1,3 +1,4 @@ +{-# LANGUAGE NamedFieldPuns #-} {-# LANGUAGE NoImplicitPrelude #-} {-# LANGUAGE BangPatterns #-} {-# LANGUAGE TemplateHaskell #-} @@ -9,10 +10,12 @@ {-# LANGUAGE GADTs #-} {-# LANGUAGE OverloadedStrings #-} {-# LANGUAGE ViewPatterns #-} +{-# LANGUAGE LambdaCase #-} module Pantry.Storage ( SqlBackend , initStorage , withStorage + , migrateAll , storeBlob , loadBlob , loadBlobById @@ -33,6 +36,9 @@ module Pantry.Storage , loadTree , storeHPack , loadPackageById + , getPackageNameById + , getPackageNameId + , getVersionId , getTreeForKey , storeHackageTree , loadHackageTree @@ -51,14 +57,24 @@ module Pantry.Storage , getSnapshotCacheId , storeSnapshotModuleCache , loadExposedModulePackages - + , PackageNameId + , PackageName + , VersionId + , ModuleNameId + , Version + , Unique(..) + , EntityField(..) -- avoid warnings , BlobId + , Key(unBlobKey) , HackageCabalId + , HackageCabal(..) , HackageTarballId , CacheUpdateId , FilePathId + , Tree(..) , TreeId + , TreeEntry(..) , TreeEntryId , ArchiveCacheId , RepoCacheId @@ -243,27 +259,46 @@ withStorage withStorage action = flip SQLite.withStorage_ action =<< view (P.pantryConfigL.to P.pcStorage) +-- | This is a helper type to distinguish db queries between different rdbms backends. The important +-- part is that the affects described in this data type should be semantically equivalent between +-- the supported engines. +data RdbmsActions env a = RdbmsActions + { raSqlite :: !(ReaderT SqlBackend (RIO env) a) + -- ^ A query that is specific to SQLite + , raPostgres :: !(ReaderT SqlBackend (RIO env) a) + -- ^ A query that is specific to PostgreSQL + } + +-- | This function provides a way to create queries supported by multiple sql backends. 
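-- A minimal usage sketch for 'rdbmsAwareQuery' (defined directly below). Both
-- branches here happen to share the same SQL; each field can instead carry an
-- engine-specific statement, as 'storeBlob' and 'getFilePathId' below do with
-- ON CONFLICT on PostgreSQL. The helper name 'countBlobsSketch' is
-- illustrative only.
countBlobsSketch :: ReaderT SqlBackend (RIO env) Int
countBlobsSketch =
  rdbmsAwareQuery
    RdbmsActions
      { raSqlite = countViaSql
      , raPostgres = countViaSql
      }
  where
    countViaSql = do
      res <- rawSql "SELECT COUNT(*) FROM blob" []
      pure $ case res of
        [Single n] -> n
        _ -> 0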
+rdbmsAwareQuery + :: RdbmsActions env a + -> ReaderT SqlBackend (RIO env) a +rdbmsAwareQuery RdbmsActions {raSqlite, raPostgres} = do + rdbms <- connRDBMS <$> ask + case rdbms of + "postgresql" -> raPostgres + "sqlite" -> raSqlite + _ -> error $ "rdbmsAwareQuery: unsupported rdbms '" ++ T.unpack rdbms ++ "'" + + +getPackageNameById + :: PackageNameId + -> ReaderT SqlBackend (RIO env) (Maybe P.PackageName) +getPackageNameById = fmap (unPackageNameP . packageNameName <$>) . get + + getPackageNameId - :: (HasPantryConfig env, HasLogFunc env) - => P.PackageName + :: P.PackageName -> ReaderT SqlBackend (RIO env) PackageNameId getPackageNameId = fmap (either entityKey id) . insertBy . PackageName . PackageNameP getVersionId - :: (HasPantryConfig env, HasLogFunc env) - => P.Version + :: P.Version -> ReaderT SqlBackend (RIO env) VersionId getVersionId = fmap (either entityKey id) . insertBy . Version . VersionP -getFilePathId - :: (HasPantryConfig env, HasLogFunc env) - => SafeFilePath - -> ReaderT SqlBackend (RIO env) FilePathId -getFilePathId = fmap (either entityKey id) . insertBy . FilePath - storeBlob - :: (HasPantryConfig env, HasLogFunc env) - => ByteString + :: ByteString -> ReaderT SqlBackend (RIO env) (BlobId, BlobKey) storeBlob bs = do let sha = SHA256.hashBytes bs @@ -271,16 +306,31 @@ storeBlob bs = do keys <- selectKeysList [BlobSha ==. sha] [] key <- case keys of - [] -> insert Blob - { blobSha = sha - , blobSize = size - , blobContents = bs - } + [] -> + rdbmsAwareQuery + RdbmsActions + { raSqlite = + insert Blob {blobSha = sha, blobSize = size, blobContents = bs} + , raPostgres = + do rawExecute + "INSERT INTO blob(sha, size, contents) VALUES (?, ?, ?) ON CONFLICT DO NOTHING" + [ toPersistValue sha + , toPersistValue size + , toPersistValue bs + ] + rawSql + "SELECT blob.id FROM blob WHERE blob.sha = ?" + [toPersistValue sha] >>= \case + [Single key] -> pure key + _ -> + error + "soreBlob: there was a critical problem storing a blob." + } key:rest -> assert (null rest) (pure key) pure (key, P.BlobKey sha size) -loadBlob - :: (HasPantryConfig env, HasLogFunc env) +loadBlob :: + HasLogFunc env => BlobKey -> ReaderT SqlBackend (RIO env) (Maybe ByteString) loadBlob (P.BlobKey sha size) = do @@ -295,27 +345,17 @@ loadBlob (P.BlobKey sha size) = do ". Expected size: " <> display size <> ". Actual size: " <> display (blobSize bt)) -loadBlobBySHA - :: (HasPantryConfig env, HasLogFunc env) - => SHA256 - -> ReaderT SqlBackend (RIO env) (Maybe BlobId) +loadBlobBySHA :: SHA256 -> ReaderT SqlBackend (RIO env) (Maybe BlobId) loadBlobBySHA sha = listToMaybe <$> selectKeysList [BlobSha ==. sha] [] -loadBlobById - :: (HasPantryConfig env, HasLogFunc env) - => BlobId - -> ReaderT SqlBackend (RIO env) ByteString +loadBlobById :: BlobId -> ReaderT SqlBackend (RIO env) ByteString loadBlobById bid = do mbt <- get bid case mbt of Nothing -> error "loadBlobById: ID doesn't exist in database" Just bt -> pure $ blobContents bt - -getBlobKey - :: (HasPantryConfig env, HasLogFunc env) - => BlobId - -> ReaderT SqlBackend (RIO env) BlobKey +getBlobKey :: BlobId -> ReaderT SqlBackend (RIO env) BlobKey getBlobKey bid = do res <- rawSql "SELECT sha, size FROM blob WHERE id=?" 
[toPersistValue bid] case res of @@ -323,19 +363,13 @@ getBlobKey bid = do [(Single sha, Single size)] -> pure $ P.BlobKey sha size _ -> error $ "getBlobKey failed due to non-unique ID: " ++ show (bid, res) -getBlobId - :: (HasPantryConfig env, HasLogFunc env) - => BlobKey - -> ReaderT SqlBackend (RIO env) (Maybe BlobId) +getBlobId :: BlobKey -> ReaderT SqlBackend (RIO env) (Maybe BlobId) getBlobId (P.BlobKey sha size) = do res <- rawSql "SELECT id FROM blob WHERE sha=? AND size=?" [toPersistValue sha, toPersistValue size] pure $ listToMaybe $ map unSingle res -loadURLBlob - :: (HasPantryConfig env, HasLogFunc env) - => Text - -> ReaderT SqlBackend (RIO env) (Maybe ByteString) +loadURLBlob :: Text -> ReaderT SqlBackend (RIO env) (Maybe ByteString) loadURLBlob url = do ment <- rawSql "SELECT blob.contents\n\ @@ -348,11 +382,7 @@ loadURLBlob url = do [] -> pure Nothing (Single bs) : _ -> pure $ Just bs -storeURLBlob - :: (HasPantryConfig env, HasLogFunc env) - => Text - -> ByteString - -> ReaderT SqlBackend (RIO env) () +storeURLBlob :: Text -> ByteString -> ReaderT SqlBackend (RIO env) () storeURLBlob url blob = do (blobId, _) <- storeBlob blob now <- getCurrentTime @@ -362,17 +392,11 @@ storeURLBlob url blob = do , urlBlobTime = now } -clearHackageRevisions - :: (HasPantryConfig env, HasLogFunc env) - => ReaderT SqlBackend (RIO env) () +clearHackageRevisions :: ReaderT SqlBackend (RIO env) () clearHackageRevisions = deleteWhere ([] :: [Filter HackageCabal]) -storeHackageRevision - :: (HasPantryConfig env, HasLogFunc env) - => P.PackageName - -> P.Version - -> BlobId - -> ReaderT SqlBackend (RIO env) () +storeHackageRevision :: + P.PackageName -> P.Version -> BlobId -> ReaderT SqlBackend (RIO env) () storeHackageRevision name version key = do nameid <- getPackageNameId name versionid <- getVersionId version @@ -389,8 +413,7 @@ storeHackageRevision name version key = do } loadHackagePackageVersions - :: (HasPantryConfig env, HasLogFunc env) - => P.PackageName + :: P.PackageName -> ReaderT SqlBackend (RIO env) (Map P.Version (Map Revision BlobKey)) loadHackagePackageVersions name = do nameid <- getPackageNameId name @@ -407,8 +430,7 @@ loadHackagePackageVersions name = do (version, Map.singleton revision (P.BlobKey key size)) loadHackagePackageVersion - :: (HasPantryConfig env, HasLogFunc env) - => P.PackageName + :: P.PackageName -> P.Version -> ReaderT SqlBackend (RIO env) (Map Revision (BlobId, P.BlobKey)) loadHackagePackageVersion name version = do @@ -427,18 +449,13 @@ loadHackagePackageVersion name version = do (revision, (bid, P.BlobKey sha size)) loadLatestCacheUpdate - :: (HasPantryConfig env, HasLogFunc env) - => ReaderT SqlBackend (RIO env) (Maybe (FileSize, SHA256)) + :: ReaderT SqlBackend (RIO env) (Maybe (FileSize, SHA256)) loadLatestCacheUpdate = fmap go <$> selectFirst [] [Desc CacheUpdateTime] where go (Entity _ cu) = (cacheUpdateSize cu, cacheUpdateSha cu) -storeCacheUpdate - :: (HasPantryConfig env, HasLogFunc env) - => FileSize - -> SHA256 - -> ReaderT SqlBackend (RIO env) () +storeCacheUpdate :: FileSize -> SHA256 -> ReaderT SqlBackend (RIO env) () storeCacheUpdate size sha = do now <- getCurrentTime insert_ CacheUpdate @@ -448,8 +465,7 @@ storeCacheUpdate size sha = do } storeHackageTarballInfo - :: (HasPantryConfig env, HasLogFunc env) - => P.PackageName + :: P.PackageName -> P.Version -> SHA256 -> FileSize @@ -465,8 +481,7 @@ storeHackageTarballInfo name version sha size = do } loadHackageTarballInfo - :: (HasPantryConfig env, HasLogFunc env) - => P.PackageName + :: 
P.PackageName -> P.Version -> ReaderT SqlBackend (RIO env) (Maybe (SHA256, FileSize)) loadHackageTarballInfo name version = do @@ -477,8 +492,7 @@ loadHackageTarballInfo name version = do go (Entity _ ht) = (hackageTarballSha ht, hackageTarballSize ht) storeCabalFile :: - (HasPantryConfig env, HasLogFunc env, HasProcessContext env) - => ByteString + ByteString -> P.PackageName -> ReaderT SqlBackend (RIO env) BlobId storeCabalFile cabalBS pkgName = do @@ -488,8 +502,7 @@ storeCabalFile cabalBS pkgName = do return bid loadFilePath :: - (HasPantryConfig env, HasLogFunc env) - => SafeFilePath + SafeFilePath -> ReaderT SqlBackend (RIO env) (Entity FilePath) loadFilePath path = do fp <- getBy $ UniqueSfp path @@ -500,18 +513,18 @@ loadFilePath path = do (T.unpack $ P.unSafeFilePath path) Just record -> return record -loadHPackTreeEntity :: (HasPantryConfig env, HasLogFunc env) => TreeId -> ReaderT SqlBackend (RIO env) (Entity TreeEntry) +loadHPackTreeEntity :: TreeId -> ReaderT SqlBackend (RIO env) (Entity TreeEntry) loadHPackTreeEntity tid = do filepath <- loadFilePath P.hpackSafeFilePath let filePathId :: FilePathId = entityKey filepath hpackTreeEntry <- - selectFirst [TreeEntryTree ==. tid, TreeEntryPath ==. filePathId] [] - hpackEntity <- - case hpackTreeEntry of - Nothing -> - error $ "loadHPackTreeEntity: No package.yaml file found in TreeEntry for TreeId: " ++ (show tid) - Just record -> return record - return hpackEntity + selectFirst [TreeEntryTree ==. tid, TreeEntryPath ==. filePathId] [] + case hpackTreeEntry of + Nothing -> + error $ + "loadHPackTreeEntity: No package.yaml file found in TreeEntry for TreeId: " ++ + show tid + Just record -> return record storeHPack :: (HasPantryConfig env, HasLogFunc env, HasProcessContext env) @@ -525,7 +538,7 @@ storeHPack rpli tid = do Nothing -> generateHPack rpli tid vid Just record -> return $ entityKey record -loadCabalBlobKey :: (HasPantryConfig env, HasLogFunc env) => HPackId -> ReaderT SqlBackend (RIO env) BlobKey +loadCabalBlobKey :: HPackId -> ReaderT SqlBackend (RIO env) BlobKey loadCabalBlobKey hpackId = do hpackRecord <- getJust hpackId getBlobKey $ hPackCabalBlob hpackRecord @@ -556,11 +569,39 @@ hpackVersionId :: (HasPantryConfig env, HasLogFunc env, HasProcessContext env) => ReaderT SqlBackend (RIO env) VersionId hpackVersionId = do - hpackSoftwareVersion <- lift $ hpackVersion + hpackSoftwareVersion <- lift hpackVersion fmap (either entityKey id) $ insertBy $ Version {versionVersion = P.VersionP hpackSoftwareVersion} + +getFilePathId :: SafeFilePath -> ReaderT SqlBackend (RIO env) FilePathId +getFilePathId sfp = + selectKeysList [FilePathPath ==. sfp] [] >>= \case + [fpId] -> pure fpId + [] -> + rdbmsAwareQuery + RdbmsActions + { raSqlite = insert $ FilePath sfp + , raPostgres = + do rawExecute + "INSERT INTO file_path(path) VALUES (?) ON CONFLICT DO NOTHING" + [toPersistValue sfp] + rawSql + "SELECT id FROM file_path WHERE path = ?" + [toPersistValue sfp] >>= \case + [Single key] -> pure key + _ -> + error + "getFilePathId: there was a critical problem storing a blob." 
+ } + _ -> + error $ + "getFilePathId: FilePath unique constraint key is violated for: " ++ fp + where + fp = T.unpack (P.unSafeFilePath sfp) + + storeTree :: (HasPantryConfig env, HasLogFunc env, HasProcessContext env) => P.RawPackageLocationImmutable -- ^ for exceptions @@ -611,9 +652,7 @@ storeTree rpli (P.PackageIdentifier name version) tree@(P.TreeMap m) buildFile = P.BFCabal _ _ -> return () return (tid, pTreeKey) -getTree :: (HasPantryConfig env, HasLogFunc env) - => TreeId - -> ReaderT SqlBackend (RIO env) P.Tree +getTree :: TreeId -> ReaderT SqlBackend (RIO env) P.Tree getTree tid = do (mts :: Maybe Tree) <- get tid ts <- @@ -623,10 +662,7 @@ getTree tid = do Just ts -> pure ts loadTreeByEnt $ Entity tid ts -loadTree - :: (HasPantryConfig env, HasLogFunc env) - => P.TreeKey - -> ReaderT SqlBackend (RIO env) (Maybe P.Tree) +loadTree :: P.TreeKey -> ReaderT SqlBackend (RIO env) (Maybe P.Tree) loadTree key = do ment <- getTreeForKey key case ment of @@ -634,8 +670,7 @@ loadTree key = do Just ent -> Just <$> loadTreeByEnt ent getTreeForKey - :: (HasPantryConfig env, HasLogFunc env) - => TreeKey + :: TreeKey -> ReaderT SqlBackend (RIO env) (Maybe (Entity Tree)) getTreeForKey (P.TreeKey key) = do mbid <- getBlobId key @@ -672,8 +707,8 @@ loadPackageById rpli tid = do "loadPackageByid: invalid foreign key " ++ show (treeVersion ts) Just (Version (P.VersionP version)) -> pure version let ident = P.PackageIdentifier name version - (pentry, mtree) <- - case (treeCabal ts) of + (packageEntry, mtree) <- + case treeCabal ts of Just keyBlob -> do cabalKey <- getBlobKey keyBlob return @@ -700,17 +735,17 @@ loadPackageById rpli tid = do Package { packageTreeKey = P.TreeKey blobKey , packageTree = mtree - , packageCabalEntry = pentry + , packageCabalEntry = packageEntry , packageIdent = ident } -getHPackBlobKey :: (HasPantryConfig env, HasLogFunc env) => HPack -> ReaderT SqlBackend (RIO env) BlobKey +getHPackBlobKey :: HPack -> ReaderT SqlBackend (RIO env) BlobKey getHPackBlobKey hpackRecord = do let treeId = hPackTree hpackRecord hpackEntity <- loadHPackTreeEntity treeId getBlobKey (treeEntryBlob $ entityVal hpackEntity) -getHPackBlobKeyById :: (HasPantryConfig env, HasLogFunc env) => HPackId -> ReaderT SqlBackend (RIO env) BlobKey +getHPackBlobKeyById :: HPackId -> ReaderT SqlBackend (RIO env) BlobKey getHPackBlobKeyById hpackId = do hpackRecord <- getJust hpackId getHPackBlobKey hpackRecord @@ -731,7 +766,7 @@ getHPackCabalFile hpackRecord ts tmap cabalFile = do cbTreeEntry = P.TreeEntry cabalKey fileType hpackTreeEntry = P.TreeEntry hpackKey fileType tree = P.TreeMap $ Map.insert cabalFile cbTreeEntry tmap - return $ + return ( P.PCHpack $ P.PHpack { P.phOriginal = hpackTreeEntry @@ -740,10 +775,7 @@ getHPackCabalFile hpackRecord ts tmap cabalFile = do } , tree) -loadTreeByEnt - :: (HasPantryConfig env, HasLogFunc env) - => Entity Tree - -> ReaderT SqlBackend (RIO env) P.Tree +loadTreeByEnt :: Entity Tree -> ReaderT SqlBackend (RIO env) P.Tree loadTreeByEnt (Entity tid _t) = do entries <- rawSql "SELECT file_path.path, blob.sha, blob.size, tree_entry.type\n\ @@ -758,8 +790,7 @@ loadTreeByEnt (Entity tid _t) = do entries storeHackageTree - :: (HasPantryConfig env, HasLogFunc env) - => P.PackageName + :: P.PackageName -> P.Version -> BlobId -> P.TreeKey @@ -776,8 +807,7 @@ storeHackageTree name version cabal treeKey' = do [HackageCabalTree =. 
Just (entityKey ent)] loadHackageTreeKey - :: (HasPantryConfig env, HasLogFunc env) - => P.PackageName + :: P.PackageName -> P.Version -> SHA256 -> ReaderT SqlBackend (RIO env) (Maybe TreeKey) @@ -827,8 +857,7 @@ loadHackageTree rpli name ver bid = do Just tid -> Just <$> loadPackageById rpli tid storeArchiveCache - :: (HasPantryConfig env, HasLogFunc env) - => Text -- ^ URL + :: Text -- ^ URL -> Text -- ^ subdir -> SHA256 -> FileSize @@ -847,8 +876,7 @@ storeArchiveCache url subdir sha size treeKey' = do } loadArchiveCache - :: (HasPantryConfig env, HasLogFunc env) - => Text -- ^ URL + :: Text -- ^ URL -> Text -- ^ subdir -> ReaderT SqlBackend (RIO env) [(SHA256, FileSize, TreeId)] loadArchiveCache url subdir = map go <$> selectList @@ -860,8 +888,7 @@ loadArchiveCache url subdir = map go <$> selectList go (Entity _ ac) = (archiveCacheSha ac, archiveCacheSize ac, archiveCacheTree ac) storeRepoCache - :: (HasPantryConfig env, HasLogFunc env) - => Repo + :: Repo -> Text -- ^ subdir -> TreeId -> ReaderT SqlBackend (RIO env) () @@ -877,8 +904,7 @@ storeRepoCache repo subdir tid = do } loadRepoCache - :: (HasPantryConfig env, HasLogFunc env) - => Repo + :: Repo -> Text -- ^ subdir -> ReaderT SqlBackend (RIO env) (Maybe TreeId) loadRepoCache repo subdir = fmap (repoCacheTree . entityVal) <$> selectFirst @@ -889,11 +915,8 @@ loadRepoCache repo subdir = fmap (repoCacheTree . entityVal) <$> selectFirst ] [Desc RepoCacheTime] -storePreferredVersion - :: (HasPantryConfig env, HasLogFunc env) - => P.PackageName - -> Text - -> ReaderT SqlBackend (RIO env) () +storePreferredVersion :: + P.PackageName -> Text -> ReaderT SqlBackend (RIO env) () storePreferredVersion name p = do nameid <- getPackageNameId name ment <- getBy $ UniquePreferred nameid @@ -904,17 +927,14 @@ storePreferredVersion name p = do } Just (Entity pid _) -> update pid [PreferredVersionsPreferred =. p] -loadPreferredVersion - :: (HasPantryConfig env, HasLogFunc env) - => P.PackageName - -> ReaderT SqlBackend (RIO env) (Maybe Text) +loadPreferredVersion :: + P.PackageName -> ReaderT SqlBackend (RIO env) (Maybe Text) loadPreferredVersion name = do nameid <- getPackageNameId name fmap (preferredVersionsPreferred . entityVal) <$> getBy (UniquePreferred nameid) sinkHackagePackageNames - :: (HasPantryConfig env, HasLogFunc env) - => (P.PackageName -> Bool) + :: (P.PackageName -> Bool) -> ConduitT P.PackageName Void (ReaderT SqlBackend (RIO env)) a -> ReaderT SqlBackend (RIO env) a sinkHackagePackageNames predicate sink = do @@ -1018,9 +1038,7 @@ unpackTreeToDir rpli (toFilePath -> dir) (P.TreeMap m) = do perms <- getPermissions dest setPermissions dest $ setOwnerExecutable True perms -countHackageCabals - :: (HasPantryConfig env, HasLogFunc env) - => ReaderT SqlBackend (RIO env) Int +countHackageCabals :: ReaderT SqlBackend (RIO env) Int countHackageCabals = do res <- rawSql "SELECT COUNT(*)\n\ @@ -1032,29 +1050,25 @@ countHackageCabals = do pure n getSnapshotCacheByHash - :: (HasPantryConfig env, HasLogFunc env) - => SnapshotCacheHash + :: SnapshotCacheHash -> ReaderT SqlBackend (RIO env) (Maybe SnapshotCacheId) getSnapshotCacheByHash = fmap (fmap entityKey) . getBy . UniqueSnapshotCache . unSnapshotCacheHash getSnapshotCacheId - :: (HasPantryConfig env, HasLogFunc env) - => SnapshotCacheHash + :: SnapshotCacheHash -> ReaderT SqlBackend (RIO env) SnapshotCacheId getSnapshotCacheId = fmap (either entityKey id) . insertBy . SnapshotCache . 
unSnapshotCacheHash

 getModuleNameId
-  :: (HasPantryConfig env, HasLogFunc env)
-  => P.ModuleName
+  :: P.ModuleName
   -> ReaderT SqlBackend (RIO env) ModuleNameId
 getModuleNameId =
   fmap (either entityKey id) . insertBy . ModuleName . P.ModuleNameP

 storeSnapshotModuleCache
-  :: (HasPantryConfig env, HasLogFunc env)
-  => SnapshotCacheId
+  :: SnapshotCacheId
   -> Map P.PackageName (Set P.ModuleName)
   -> ReaderT SqlBackend (RIO env) ()
 storeSnapshotModuleCache cache packageModules =
@@ -1069,8 +1083,7 @@ storeSnapshotModuleCache cache packageModules =
       }

 loadExposedModulePackages
-  :: (HasPantryConfig env, HasLogFunc env)
-  => SnapshotCacheId
+  :: SnapshotCacheId
   -> P.ModuleName
   -> ReaderT SqlBackend (RIO env) [P.PackageName]
 loadExposedModulePackages cacheId mName =
diff --git a/subs/pantry/src/Pantry/Tree.hs b/subs/pantry/src/Pantry/Tree.hs
index 63625280bc..63f2e25808 100644
--- a/subs/pantry/src/Pantry/Tree.hs
+++ b/subs/pantry/src/Pantry/Tree.hs
@@ -10,7 +10,7 @@ import RIO
 import qualified RIO.Map as Map
 import qualified RIO.Text as T
 import qualified RIO.ByteString as B
-import Pantry.Storage
+import Pantry.Storage hiding (Tree, TreeEntry)
 import Pantry.Types
 import RIO.FilePath ((</>), takeDirectory)
 import RIO.Directory (createDirectoryIfMissing, setPermissions, getPermissions, setOwnerExecutable)
diff --git a/subs/pantry/src/Pantry/Types.hs b/subs/pantry/src/Pantry/Types.hs
index 44a6318d41..a13fb8d225 100644
--- a/subs/pantry/src/Pantry/Types.hs
+++ b/subs/pantry/src/Pantry/Types.hs
@@ -10,7 +10,6 @@
 {-# LANGUAGE RecordWildCards #-}
 {-# LANGUAGE ScopedTypeVariables #-}
 {-# LANGUAGE TupleSections #-}
-{-# LANGUAGE StandaloneDeriving #-}
 {-# LANGUAGE MultiWayIf #-}
 module Pantry.Types
   ( PantryConfig (..)
@@ -91,6 +90,7 @@ module Pantry.Types
   , RawSnapshotLocation (..)
   , SnapshotLocation (..)
   , toRawSL
+  , parseHackageText
   , parseRawSnapshotLocation
   , RawSnapshotLayer (..)
   , SnapshotLayer (..)
@@ -105,6 +105,9 @@ module Pantry.Types
   , toRawPM
   , cabalFileName
   , SnapshotCacheHash (..)
+  , getGlobalHintsFile
+  , bsToBlobKey
+  , pliIdent
   ) where

 import RIO
@@ -114,7 +117,7 @@ import qualified RIO.ByteString as B
 import qualified RIO.ByteString.Lazy as BL
 import RIO.Char (isSpace)
 import RIO.List (intersperse)
-import RIO.Time (toGregorian, Day, fromGregorianValid)
+import RIO.Time (toGregorian, Day, fromGregorianValid, UTCTime)
 import qualified RIO.Map as Map
 import qualified RIO.HashMap as HM
 import qualified Data.Map.Strict as Map (mapKeysMonotonic)
@@ -122,6 +125,7 @@ import qualified RIO.Set as Set
 import Data.Aeson (ToJSON (..), FromJSON (..), withText, FromJSONKey (..))
 import Data.Aeson.Types (ToJSONKey (..)
,toJSONKeyText, Parser) import Data.Aeson.Extended +import Data.Aeson.Encoding.Internal (unsafeToEncoding) import Data.ByteString.Builder (toLazyByteString, byteString, wordDec) import Database.Persist import Database.Persist.Sql @@ -130,7 +134,7 @@ import qualified Pantry.SHA256 as SHA256 import qualified Distribution.Compat.ReadP as Parse import Distribution.CabalSpecVersion (CabalSpecVersion (..), cabalSpecLatest) import Distribution.Parsec.Common (PError (..), PWarning (..), showPos) -import Distribution.Types.PackageName (PackageName, unPackageName) +import Distribution.Types.PackageName (PackageName, unPackageName, mkPackageName) import Distribution.Types.VersionRange (VersionRange) import Distribution.PackageDescription (FlagName, unFlagName, GenericPackageDescription) import Distribution.Types.PackageId (PackageIdentifier (..)) @@ -205,8 +209,8 @@ newtype Revision = Revision Word -- whether a pool is used, and the default implementation in -- "Pantry.Storage" does not use a pool. data Storage = Storage - { withStorage_ :: (forall env a. HasLogFunc env => ReaderT SqlBackend (RIO env) a -> RIO env a) - , withWriteLock_ :: (forall env a. HasLogFunc env => RIO env a -> RIO env a) + { withStorage_ :: forall env a. HasLogFunc env => ReaderT SqlBackend (RIO env) a -> RIO env a + , withWriteLock_ :: forall env a. HasLogFunc env => RIO env a -> RIO env a } -- | Configuration value used by the entire pantry package. Create one @@ -483,9 +487,9 @@ data Repo = Repo -- -- @since 0.1.0.0 , repoSubdir :: !Text - -- ^ Subdirectory within the archive to get the package from. - -- - -- @since 0.1.0.0 + -- ^ Subdirectory within the archive to get the package from. + -- + -- @since 0.1.0.0 } deriving (Generic, Eq, Ord, Typeable) instance NFData Repo @@ -504,6 +508,7 @@ instance Display Repo where then mempty else " in subdirectory " <> display subdir) + -- An unexported newtype wrapper to hang a 'FromJSON' instance off of. Contains -- a GitHub user and repo name separated by a forward slash, e.g. "foo/bar". newtype GitHubRepo = GitHubRepo Text @@ -540,6 +545,7 @@ instance FromJSON (WithJSONWarnings HackageSecurityConfig) where hscIgnoreExpiry <- o ..:? "ignore-expiry" ..!= False pure HackageSecurityConfig {..} + -- | An environment which contains a 'PantryConfig'. -- -- @since 0.1.0.0 @@ -549,6 +555,7 @@ class HasPantryConfig env where -- @since 0.1.0.0 pantryConfigL :: Lens' env PantryConfig + -- | File size in bytes -- -- @since 0.1.0.0 @@ -587,7 +594,9 @@ instance FromJSON BlobKey where <*> o .: "size" newtype PackageNameP = PackageNameP { unPackageNameP :: PackageName } - deriving (Show) + deriving (Eq, Ord, Show, Read, NFData) +instance Display PackageNameP where + display = fromString . packageNameString . unPackageNameP instance PersistField PackageNameP where toPersistValue (PackageNameP pn) = PersistText $ T.pack $ packageNameString pn fromPersistValue v = do @@ -597,9 +606,20 @@ instance PersistField PackageNameP where Just pn -> Right $ PackageNameP pn instance PersistFieldSql PackageNameP where sqlType _ = SqlString - -newtype VersionP = VersionP Version - deriving (Show) +instance ToJSON PackageNameP where + toJSON (PackageNameP pn) = String $ T.pack $ packageNameString pn +instance FromJSON PackageNameP where + parseJSON = withText "PackageNameP" $ pure . PackageNameP . mkPackageName . T.unpack +instance ToJSONKey PackageNameP where + toJSONKey = + ToJSONKeyText + (T.pack . packageNameString . unPackageNameP) + (unsafeToEncoding . getUtf8Builder . 
display) +instance FromJSONKey PackageNameP where + fromJSONKey = FromJSONKeyText $ PackageNameP . mkPackageName . T.unpack + +newtype VersionP = VersionP { unVersionP :: Version } + deriving (Eq, Ord, Show, Read, NFData) instance PersistField VersionP where toPersistValue (VersionP v) = PersistText $ T.pack $ versionString v fromPersistValue v = do @@ -609,9 +629,20 @@ instance PersistField VersionP where Just ver -> Right $ VersionP ver instance PersistFieldSql VersionP where sqlType _ = SqlString - -newtype ModuleNameP = ModuleNameP ModuleName - deriving (Show) +instance Display VersionP where + display (VersionP v) = fromString $ versionString v +instance ToJSON VersionP where + toJSON (VersionP v) = String $ T.pack $ versionString v +instance FromJSON VersionP where + parseJSON = + withText "VersionP" $ + either (fail . displayException) (pure . VersionP) . parseVersionThrowing . T.unpack + +newtype ModuleNameP = ModuleNameP + { unModuleNameP :: ModuleName + } deriving (Eq, Ord, Show, NFData) +instance Display ModuleNameP where + display = fromString . moduleNameString . unModuleNameP instance PersistField ModuleNameP where toPersistValue (ModuleNameP mn) = PersistText $ T.pack $ moduleNameString mn fromPersistValue v = do @@ -683,6 +714,32 @@ instance FromJSON PackageIdentifierRevision where Left e -> fail $ show e Right pir -> pure pir +-- | Parse a hackage text. +parseHackageText :: Text -> Either PantryException (PackageIdentifier, BlobKey) +parseHackageText t = maybe (Left $ PackageIdentifierRevisionParseFail t) Right $ do + let (identT, cfiT) = T.break (== '@') t + PackageIdentifier name version <- parsePackageIdentifier $ T.unpack identT + (csha, csize) <- + case splitColon cfiT of + Just ("@sha256", shaSizeT) -> do + let (shaT, sizeT) = T.break (== ',') shaSizeT + sha <- either (const Nothing) Just $ SHA256.fromHexText shaT + msize <- + case T.stripPrefix "," sizeT of + Nothing -> Nothing + Just sizeT' -> + case decimal sizeT' of + Right (size', "") -> Just $ (sha, FileSize size') + _ -> Nothing + pure msize + _ -> Nothing + pure $ (PackageIdentifier name version, BlobKey csha csize) + +splitColon :: Text -> Maybe (Text, Text) +splitColon t' = + let (x, y) = T.break (== ':') t' + in (x, ) <$> T.stripPrefix ":" y + -- | Parse a 'PackageIdentifierRevision' -- -- @since 0.1.0.0 @@ -710,10 +767,6 @@ parsePackageIdentifierRevision t = maybe (Left $ PackageIdentifierRevisionParseF Nothing -> pure CFILatest _ -> Nothing pure $ PackageIdentifierRevision name version cfi - where - splitColon t' = - let (x, y) = T.break (== ':') t' - in (x, ) <$> T.stripPrefix ":" y data Mismatch a = Mismatch { mismatchExpected :: !a @@ -1350,6 +1403,17 @@ instance Display PackageMetadata where , "cabal file == " <> display (pmCabal pm) ] +parsePackageMetadata :: Object -> WarningParser PackageMetadata +parsePackageMetadata o = do + pmCabal :: BlobKey <- o ..: "cabal-file" + pantryTree :: BlobKey <- o ..: "pantry-tree" + CabalString pkgName <- o ..: "name" + CabalString pkgVersion <- o ..: "version" + let pmTreeKey = TreeKey pantryTree + pmIdent = PackageIdentifier {..} + pure PackageMetadata {..} + + -- | Conver package metadata to its "raw" equivalent. 
-- -- @since 0.1.0.0 @@ -1462,6 +1526,57 @@ rpmToPairs (RawPackageMetadata mname mversion mtree mcabal) = concat , maybe [] (\cabal -> ["cabal-file" .= cabal]) mcabal ] +instance FromJSON (WithJSONWarnings (Unresolved PackageLocationImmutable)) where + parseJSON v = repoObject v <|> archiveObject v <|> hackageObject v <|> github v + <|> fail ("Could not parse a UnresolvedPackageLocationImmutable from: " ++ show v) + where + repoObject :: Value -> Parser (WithJSONWarnings (Unresolved PackageLocationImmutable)) + repoObject = withObjectWarnings "UnresolvedPackageLocationImmutable.PLIRepo" $ \o -> do + pm <- parsePackageMetadata o + repoSubdir <- o ..:? "subdir" ..!= "" + repoCommit <- o ..: "commit" + (repoType, repoUrl) <- + (o ..: "git" >>= \url -> pure (RepoGit, url)) <|> + (o ..: "hg" >>= \url -> pure (RepoHg, url)) + pure $ pure $ PLIRepo Repo {..} pm + + archiveObject = + withObjectWarnings "UnresolvedPackageLocationImmutable.PLIArchive" $ \o -> do + pm <- parsePackageMetadata o + Unresolved mkArchiveLocation <- parseArchiveLocationObject o + archiveHash <- o ..: "sha256" + archiveSize <- o ..: "size" + archiveSubdir <- o ..:? "subdir" ..!= "" + pure $ Unresolved $ \mdir -> do + archiveLocation <- mkArchiveLocation mdir + pure $ PLIArchive Archive {..} pm + + hackageObject = + withObjectWarnings "UnresolvedPackagelocationimmutable.PLIHackage (Object)" $ \o -> do + treeKey <- o ..: "pantry-tree" + htxt <- o ..: "hackage" + case parseHackageText htxt of + Left e -> fail $ show e + Right (pkgIdentifier, blobKey) -> + pure $ pure $ PLIHackage pkgIdentifier blobKey (TreeKey treeKey) + + github value = + withObjectWarnings "UnresolvedPackagelocationimmutable.PLIArchive:github" (\o -> do + pm <- parsePackageMetadata o + GitHubRepo ghRepo <- o ..: "github" + commit <- o ..: "commit" + let archiveLocation = ALUrl $ T.concat + [ "https://github.com/" + , ghRepo + , "/archive/" + , commit + , ".tar.gz" + ] + archiveHash <- o ..: "sha256" + archiveSize <- o ..: "size" + archiveSubdir <- o ..:? "subdir" ..!= "" + pure $ pure $ PLIArchive Archive {..} pm) value + instance FromJSON (WithJSONWarnings (Unresolved (NonEmpty RawPackageLocationImmutable))) where parseJSON v = http v @@ -1470,10 +1585,10 @@ instance FromJSON (WithJSONWarnings (Unresolved (NonEmpty RawPackageLocationImmu <|> repo v <|> archiveObject v <|> github v - <|> fail ("Could not parse a UnresolvedPackageLocationImmutable from: " ++ show v) + <|> fail ("Could not parse a UnresolvedRawPackageLocationImmutable from: " ++ show v) where http :: Value -> Parser (WithJSONWarnings (Unresolved (NonEmpty RawPackageLocationImmutable))) - http = withText "UnresolvedPackageLocationImmutable.UPLIArchive (Text)" $ \t -> + http = withText "UnresolvedPackageLocationImmutable.RPLIArchive (Text)" $ \t -> case parseArchiveLocationText t of Nothing -> fail $ "Invalid archive location: " ++ T.unpack t Just (Unresolved mkArchiveLocation) -> @@ -1519,7 +1634,7 @@ instance FromJSON (WithJSONWarnings (Unresolved (NonEmpty RawPackageLocationImmu os <- optionalSubdirs o pure $ pure $ NE.map (\(repoSubdir, pm) -> RPLIRepo Repo {..} pm) (osToRpms os) - archiveObject = withObjectWarnings "UnresolvedPackageLocationImmutable.UPLIArchive" $ \o -> do + archiveObject = withObjectWarnings "UnresolvedPackageLocationImmutable.RPLIArchive" $ \o -> do Unresolved mkArchiveLocation <- parseArchiveLocationObject o raHash <- o ..:? "sha256" raSize <- o ..:? 
"size" @@ -1624,6 +1739,7 @@ data HpackExecutable -- ^ Executable at the provided path deriving (Show, Read, Eq, Ord) + -- | Which compiler a snapshot wants to use. The build tool may elect -- to do some fuzzy matching of versions (e.g., allowing different -- patch versions). @@ -1637,6 +1753,7 @@ data WantedCompiler !Version -- ^ GHCJS version followed by GHC version deriving (Show, Eq, Ord, Generic) + instance NFData WantedCompiler instance Display WantedCompiler where display (WCGhc vghc) = "ghc-" <> fromString (versionString vghc) @@ -1881,6 +1998,27 @@ instance NFData SnapshotLocation instance ToJSON SnapshotLocation where toJSON sl = toJSON (toRawSL sl) +instance FromJSON (WithJSONWarnings (Unresolved SnapshotLocation)) where + parseJSON v = file v <|> url v <|> compiler v + where + file = withObjectWarnings "SLFilepath" $ \o -> do + ufp <- o ..: "filepath" + pure $ Unresolved $ \mdir -> + case mdir of + Nothing -> throwIO $ InvalidFilePathSnapshot ufp + Just dir -> do + absolute <- resolveFile dir (T.unpack ufp) + let fp = ResolvedPath (RelFilePath ufp) absolute + pure $ SLFilePath fp + url = withObjectWarnings "SLUrl" $ \o -> do + url' <- o ..: "url" + sha <- o ..: "sha256" + size <- o ..: "size" + pure $ Unresolved $ \_ -> pure $ SLUrl url' (BlobKey sha size) + compiler = withObjectWarnings "SLCompiler" $ \o -> do + c <- o ..: "compiler" + pure $ Unresolved $ \_ -> pure $ SLCompiler c + -- | Convert snapshot location to its "raw" equivalent. -- -- @since 0.1.0.0 @@ -1984,6 +2122,10 @@ data RawSnapshotLayer = RawSnapshotLayer -- ^ GHC options per package -- -- @since 0.1.0.0 + , rslPublishTime :: !(Maybe UTCTime) + -- ^ See 'slPublishTime' + -- + -- @since 0.1.0.0 } deriving (Show, Eq, Generic) @@ -2006,6 +2148,7 @@ instance ToJSON RawSnapshotLayer where , if Map.null (rslGhcOptions rsnap) then [] else ["ghc-options" .= toCabalStringMap (rslGhcOptions rsnap)] + , maybe [] (\time -> ["publish-time" .= time]) (rslPublishTime rsnap) ] instance FromJSON (WithJSONWarnings (Unresolved RawSnapshotLayer)) where @@ -2028,6 +2171,7 @@ instance FromJSON (WithJSONWarnings (Unresolved RawSnapshotLayer)) where rslFlags <- (unCabalStringMap . fmap unCabalStringMap) <$> (o ..:? "flags" ..!= Map.empty) rslHidden <- unCabalStringMap <$> (o ..:? "hidden" ..!= Map.empty) rslGhcOptions <- unCabalStringMap <$> (o ..:? "ghc-options" ..!= Map.empty) + rslPublishTime <- o ..:? "publish-time" pure $ (\rslLocations (rslParent, rslCompiler) -> RawSnapshotLayer {..}) <$> ((concat . map NE.toList) <$> sequenceA unresolvedLocs) <*> unresolvedSnapshotParent @@ -2072,18 +2216,24 @@ data SnapshotLayer = SnapshotLayer -- ^ GHC options per package -- -- @since 0.1.0.0 + , slPublishTime :: !(Maybe UTCTime) + -- ^ Publication timestamp for this snapshot. This field is optional, and + -- is for informational purposes only. 
+  --
+  -- @since 0.1.0.0
   }
   deriving (Show, Eq, Generic)

 instance ToJSON SnapshotLayer where
   toJSON snap = object $ concat
     [ ["resolver" .= slParent snap]
-    , ["compiler" .= slCompiler snap]
+    , maybe [] (\compiler -> ["compiler" .= compiler]) (slCompiler snap)
     , ["packages" .= slLocations snap]
     , if Set.null (slDropPackages snap) then [] else ["drop-packages" .= Set.map CabalString (slDropPackages snap)]
     , if Map.null (slFlags snap) then [] else ["flags" .= fmap toCabalStringMap (toCabalStringMap (slFlags snap))]
     , if Map.null (slHidden snap) then [] else ["hidden" .= toCabalStringMap (slHidden snap)]
     , if Map.null (slGhcOptions snap) then [] else ["ghc-options" .= toCabalStringMap (slGhcOptions snap)]
+    , maybe [] (\time -> ["publish-time" .= time]) (slPublishTime snap)
     ]

 -- | Convert snapshot layer into its "raw" equivalent.
@@ -2098,7 +2248,28 @@ toRawSnapshotLayer sl = RawSnapshotLayer
   , rslFlags = slFlags sl
   , rslHidden = slHidden sl
   , rslGhcOptions = slGhcOptions sl
+  , rslPublishTime = slPublishTime sl
   }

 newtype SnapshotCacheHash = SnapshotCacheHash { unSnapshotCacheHash :: SHA256}
   deriving (Show)
+
+-- | Get the path to the global hints cache file
+getGlobalHintsFile :: HasPantryConfig env => RIO env (Path Abs File)
+getGlobalHintsFile = do
+  root <- view $ pantryConfigL.to pcRootDir
+  globalHintsRelFile <- parseRelFile "global-hints-cache.yaml"
+  pure $ root </> globalHintsRelFile
+
+-- | Creates a BlobKey for an input ByteString
+--
+-- @since 0.1.0.0
+bsToBlobKey :: ByteString -> BlobKey
+bsToBlobKey bs =
+  BlobKey (SHA256.hashBytes bs) (FileSize (fromIntegral (B.length bs)))
+
+-- | Identifier from a 'PackageLocationImmutable'
+pliIdent :: PackageLocationImmutable -> PackageIdentifier
+pliIdent (PLIHackage ident _ _) = ident
+pliIdent (PLIArchive _ pm) = pmIdent pm
+pliIdent (PLIRepo _ pm) = pmIdent pm
diff --git a/subs/pantry/test/Pantry/ArchiveSpec.hs b/subs/pantry/test/Pantry/ArchiveSpec.hs
index 2ba99d4479..86fd49e3ef 100644
--- a/subs/pantry/test/Pantry/ArchiveSpec.hs
+++ b/subs/pantry/test/Pantry/ArchiveSpec.hs
@@ -96,3 +96,9 @@ spec = do
         , testSubdir = "subs/pant"
         }
       `shouldThrow` treeWithoutCabalFile
+  it "follows symlinks to directories" $ do
+    ident <- getRawPackageLocationIdent' TestArchive
+      { testLocation = TLFilePath "attic/symlink-to-dir.tar.gz"
+      , testSubdir = "symlink"
+      }
+    ident `shouldBe` parsePackageIdentifier' "foo-1.2.3"
diff --git a/src/test/Stack/SourceMapSpec.hs b/subs/pantry/test/Pantry/GlobalHintsSpec.hs
similarity index 53%
rename from src/test/Stack/SourceMapSpec.hs
rename to subs/pantry/test/Pantry/GlobalHintsSpec.hs
index aa416049b7..8d01bf0a7e 100644
--- a/src/test/Stack/SourceMapSpec.hs
+++ b/subs/pantry/test/Pantry/GlobalHintsSpec.hs
@@ -1,33 +1,28 @@
 {-# LANGUAGE FlexibleContexts #-}
+{-# LANGUAGE NoImplicitPrelude #-}
 {-# LANGUAGE OverloadedStrings #-}
-module Stack.SourceMapSpec (spec) where
+module Pantry.GlobalHintsSpec (spec) where

 import Distribution.Types.PackageName (mkPackageName)
 import Distribution.Version (mkVersion)
-import Stack.Options.GlobalParser (globalOptsFromMonoid)
-import Stack.Prelude
-import Stack.Runners
-import Stack.SourceMap (loadGlobalHints)
-import Stack.Types.Config (globalLogLevel)
+import RIO
+import Pantry (loadGlobalHints, WantedCompiler (..), runPantryAppClean)
+import Pantry.Internal
 import Test.Hspec
 import qualified RIO.Map as Map
-import RIO.ByteString (hPut)
-import Path.IO (resolveFile')
+import Path (toFilePath)

 spec :: Spec
 spec = do
-  describe "loadGlobalHints" $ do
-    let it' name inner = it name $
withSystemTempFile "global-hints.yaml" $ \fp h -> do - hPut h "this should be ignored" - hClose h :: IO () - abs' <- resolveFile' fp - globalOpts <- globalOptsFromMonoid False mempty - withRunnerGlobal globalOpts { globalLogLevel = LevelOther "silent" } $ inner abs' - it' "unknown compiler" $ \fp -> do - mmap <- loadGlobalHints fp $ WCGhc (mkVersion [0, 0, 0, 0, 0, 0, 0]) + let it' name inner = it name $ example $ runPantryAppClean $ do + file <- getGlobalHintsFile + writeFileBinary (toFilePath file) "this should be ignored" + inner + it' "unknown compiler" $ do + mmap <- loadGlobalHints $ WCGhc (mkVersion [0, 0, 0, 0, 0, 0, 0]) liftIO $ mmap `shouldBe` Nothing - it' "known compiler" $ \fp -> do - mmap <- loadGlobalHints fp $ WCGhc (mkVersion [8, 4, 3]) + it' "known compiler" $ do + mmap <- loadGlobalHints $ WCGhc (mkVersion [8, 4, 3]) case mmap of Nothing -> error "not found" Just m -> liftIO $ do @@ -35,8 +30,8 @@ spec = do Map.lookup (mkPackageName "base") m `shouldBe` Just (mkVersion [4, 11, 1, 0]) Map.lookup (mkPackageName "bytestring") m `shouldBe` Just (mkVersion [0, 10, 8, 2]) Map.lookup (mkPackageName "acme-missiles") m `shouldBe` Nothing - it' "older known compiler" $ \fp -> do - mmap <- loadGlobalHints fp $ WCGhc (mkVersion [7, 8, 4]) + it' "older known compiler" $ do + mmap <- loadGlobalHints $ WCGhc (mkVersion [7, 8, 4]) case mmap of Nothing -> error "not found" Just m -> liftIO $ do diff --git a/subs/pantry/test/Pantry/InternalSpec.hs b/subs/pantry/test/Pantry/InternalSpec.hs index 9b7dcaee46..689f441567 100644 --- a/subs/pantry/test/Pantry/InternalSpec.hs +++ b/subs/pantry/test/Pantry/InternalSpec.hs @@ -17,15 +17,17 @@ spec = do "file/\\test" ! Just "file/\\test" "/file/////\\test" ! Nothing "file/////\\test" ! Just "file/\\test" + "file/test/" ! Just "file/test" "/file/\\test////" ! Nothing "/file/./test" ! Nothing "file/./test" ! Just "file/test" "/test/file/../bob/fred/" ! Nothing "/test/file/../bob/fred" ! Nothing - "test/file/../bob/fred/" ! Nothing + "test/file/../bob/fred/" ! Just "test/bob/fred" "test/file/../bob/fred" ! Just "test/bob/fred" + "../bob/fred" ! Nothing "../bob/fred/" ! Nothing - "./bob/fred/" ! Nothing + "./bob/fred/" ! Just "bob/fred" "./bob/fred" ! Just "bob/fred" "./" ! Nothing "./." ! Nothing diff --git a/subs/pantry/test/Pantry/TypesSpec.hs b/subs/pantry/test/Pantry/TypesSpec.hs index c45898ed26..70fcba5865 100644 --- a/subs/pantry/test/Pantry/TypesSpec.hs +++ b/subs/pantry/test/Pantry/TypesSpec.hs @@ -1,22 +1,36 @@ {-# LANGUAGE NoImplicitPrelude #-} +{-# LANGUAGE ScopedTypeVariables #-} {-# LANGUAGE OverloadedStrings #-} +{-# LANGUAGE QuasiQuotes #-} {-# LANGUAGE RecordWildCards #-} {-# LANGUAGE FlexibleContexts #-} -module Pantry.TypesSpec (spec) where +{-# LANGUAGE FlexibleInstances #-} -import Test.Hspec +module Pantry.TypesSpec + ( spec + ) where + +import Data.Aeson.Extended +import qualified Data.ByteString.Char8 as S8 +import qualified Data.Yaml as Yaml +import Distribution.Types.PackageName (mkPackageName) +import Distribution.Types.Version (mkVersion) import Hedgehog import qualified Hedgehog.Gen as Gen import qualified Hedgehog.Range as Range import Pantry +import Pantry.Internal + ( Tree(..) + , TreeEntry(..) 
+ , mkSafeFilePath + , parseTree + , renderTree + ) import qualified Pantry.SHA256 as SHA256 -import Pantry.Internal (parseTree, renderTree, Tree (..), TreeEntry (..), mkSafeFilePath) import RIO -import Distribution.Types.Version (mkVersion) import qualified RIO.Text as T -import qualified Data.Yaml as Yaml -import Data.Aeson.Extended (WithJSONWarnings (..)) -import qualified Data.ByteString.Char8 as S8 +import Test.Hspec +import Text.RawString.QQ import RIO.Time (Day (..)) hh :: HasCallStack => String -> Property -> Spec @@ -30,6 +44,37 @@ genBlobKey = BlobKey <$> genSha256 <*> (FileSize <$> (Gen.word (Range.linear 1 1 genSha256 :: Gen SHA256 genSha256 = SHA256.hashBytes <$> Gen.bytes (Range.linear 1 500) +samplePLIRepo :: ByteString +samplePLIRepo = + [r| +subdir: wai +cabal-file: + size: 1765 + sha256: eea52c4967d8609c2f79213d6dffe6d6601034f1471776208404781de7051410 +name: wai +version: 3.2.1.2 +git: https://github.com/yesodweb/wai.git +pantry-tree: + size: 714 + sha256: ecfd0b4b75f435a3f362394807b35e5ef0647b1a25005d44a3632c49db4833d2 +commit: d11d63f1a6a92db8c637a8d33e7953ce6194a3e0 +|] + +samplePLIRepo2 :: ByteString +samplePLIRepo2 = + [r| +cabal-file: + size: 1863 + sha256: 5ebffc39e75ea1016adcc8426dc31d2040d2cc8a5f4bbce228592ef35e233da2 +name: merkle-log +version: 0.1.0.0 +git: https://github.com/kadena-io/merkle-log.git +pantry-tree: + size: 615 + sha256: 5a99e5e41ccd675a7721a733714ba2096f4204d9010f867c5fb7095b78e2959d +commit: a7ae61d7082afe3aa1a0fd0546fc1351a2f7c376 +|] + spec :: Spec spec = do describe "WantedCompiler" $ do @@ -110,3 +155,62 @@ spec = do liftIO $ Yaml.toJSON (nightlySnapshotLocation day) `shouldBe` Yaml.String (T.pack $ "nightly-" ++ show day) + it "FromJSON instance for PLIRepo" $ do + WithJSONWarnings unresolvedPli warnings <- Yaml.decodeThrow samplePLIRepo + warnings `shouldBe` [] + pli <- resolvePaths Nothing unresolvedPli + let repoValue = + Repo + { repoSubdir = "wai" + , repoType = RepoGit + , repoCommit = + "d11d63f1a6a92db8c637a8d33e7953ce6194a3e0" + , repoUrl = "https://github.com/yesodweb/wai.git" + } + cabalSha = + SHA256.fromHexBytes + "eea52c4967d8609c2f79213d6dffe6d6601034f1471776208404781de7051410" + pantrySha = + SHA256.fromHexBytes + "ecfd0b4b75f435a3f362394807b35e5ef0647b1a25005d44a3632c49db4833d2" + (csha, psha) <- case (cabalSha, pantrySha) of + (Right csha, Right psha) -> pure (csha, psha) + _ -> fail "Failed decoding sha256" + let pkgValue = + PackageMetadata + { pmIdent = + PackageIdentifier + (mkPackageName "wai") + (mkVersion [3, 2, 1, 2]) + , pmTreeKey = TreeKey (BlobKey psha (FileSize 714)) + , pmCabal = BlobKey csha (FileSize 1765) + } + pli `shouldBe` PLIRepo repoValue pkgValue + + WithJSONWarnings reparsed warnings2 <- Yaml.decodeThrow $ Yaml.encode pli + warnings2 `shouldBe` [] + reparsed' <- resolvePaths Nothing reparsed + reparsed' `shouldBe` pli + it "parseHackageText parses" $ do + let txt = + "persistent-2.8.2@sha256:df118e99f0c46715e932fe82d787fc09689d87898f3a8b13f5954d25af6b46a1,5058" + hsha = + SHA256.fromHexBytes + "df118e99f0c46715e932fe82d787fc09689d87898f3a8b13f5954d25af6b46a1" + sha <- case hsha of + Right sha' -> pure sha' + _ -> fail "parseHackagetext: failed decoding the sha256" + let Right (pkgIdentifier, blobKey) = parseHackageText txt + blobKey `shouldBe` (BlobKey sha (FileSize 5058)) + pkgIdentifier `shouldBe` + PackageIdentifier + (mkPackageName "persistent") + (mkVersion [2, 8, 2]) + it "roundtripping a PLIRepo" $ do + WithJSONWarnings unresolvedPli warnings <- Yaml.decodeThrow samplePLIRepo2 + warnings 
`shouldBe` []
+    pli <- resolvePaths Nothing unresolvedPli
+    WithJSONWarnings unresolvedPli2 warnings2 <- Yaml.decodeThrow $ Yaml.encode pli
+    warnings2 `shouldBe` []
+    pli2 <- resolvePaths Nothing unresolvedPli2
+    pli2 `shouldBe` (pli :: PackageLocationImmutable)
diff --git a/test/files/iface/x32/Main.hs b/test/files/iface/x32/Main.hs
new file mode 100644
index 0000000000..6fd36ba675
--- /dev/null
+++ b/test/files/iface/x32/Main.hs
@@ -0,0 +1,7 @@
+{-# LANGUAGE TemplateHaskell #-}
+import Language.Haskell.TH.Syntax
+
+main :: IO ()
+main = $(do
+  qAddDependentFile "some-dependency.txt"
+  [|pure ()|])
diff --git a/test/files/iface/x32/ghc7103/Main.hi b/test/files/iface/x32/ghc7103/Main.hi
new file mode 100644
index 0000000000..58f3c54d70
Binary files /dev/null and b/test/files/iface/x32/ghc7103/Main.hi differ
diff --git a/test/files/iface/x32/ghc802/Main.hi b/test/files/iface/x32/ghc802/Main.hi
new file mode 100644
index 0000000000..bbfae7e387
Binary files /dev/null and b/test/files/iface/x32/ghc802/Main.hi differ
diff --git a/test/files/iface/x32/ghc822/Main.hi b/test/files/iface/x32/ghc822/Main.hi
new file mode 100644
index 0000000000..c3c1ae8ad1
Binary files /dev/null and b/test/files/iface/x32/ghc822/Main.hi differ
diff --git a/test/files/iface/x32/ghc844/Main.hi b/test/files/iface/x32/ghc844/Main.hi
new file mode 100644
index 0000000000..19b0f70fcc
Binary files /dev/null and b/test/files/iface/x32/ghc844/Main.hi differ
diff --git a/test/files/iface/x32/run.sh b/test/files/iface/x32/run.sh
new file mode 100755
index 0000000000..0a4a74f6ba
--- /dev/null
+++ b/test/files/iface/x32/run.sh
@@ -0,0 +1,16 @@
+#!/usr/bin/env bash
+
+set -eux
+
+go() {
+  for ver in 7.10.3 8.0.2 8.2.2 8.4.4 8.6.4
+  do
+    stack --resolver ghc-$ver --arch i386 ghc -- -fforce-recomp Main.hs
+    local DIR
+    DIR=ghc"$(echo $ver | tr -d '.')"
+    mkdir -p $DIR
+    mv Main.hi $DIR/Main.hi
+  done
+}
+
+go
diff --git a/test/files/iface/x64/Main.hs b/test/files/iface/x64/Main.hs
new file mode 100644
index 0000000000..524ae0d0e4
--- /dev/null
+++ b/test/files/iface/x64/Main.hs
@@ -0,0 +1,18 @@
+{-# LANGUAGE CPP #-}
+{-# LANGUAGE TemplateHaskell #-}
+
+module Main where
+
+import GHC.Types
+import Language.Haskell.TH.Lib
+import Language.Haskell.TH.Syntax
+import X
+
+#include "Test.h"
+
+main :: IO ()
+main = putStrLn "Hello, World!"
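+
+-- 'f' below registers README.md with 'qAddDependentFile', so the generated
+-- .hi fixtures record a dependent-file entry; presumably this is what the
+-- interface-file tests that consume these fixtures look for.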
+
+f :: String
+f = $(let readme = "README.md"
+       in qAddDependentFile readme *> (stringE =<< qRunIO (readFile readme)))
diff --git a/test/files/iface/x64/README.md b/test/files/iface/x64/README.md
new file mode 100644
index 0000000000..d245656d61
--- /dev/null
+++ b/test/files/iface/x64/README.md
@@ -0,0 +1,3 @@
+# Generating the dummy iface
+
+Update the `supportedVersions` in the `shell.nix` and then run the following command: `nix-shell --pure --run "generate"`
diff --git a/test/files/iface/x64/Test.h b/test/files/iface/x64/Test.h
new file mode 100644
index 0000000000..bb31ec3f0f
--- /dev/null
+++ b/test/files/iface/x64/Test.h
@@ -0,0 +1,2 @@
+#define TRUE 1
+#define FALSE 0
diff --git a/test/files/iface/x64/X.hs b/test/files/iface/x64/X.hs
new file mode 100644
index 0000000000..a1d5b7bc22
--- /dev/null
+++ b/test/files/iface/x64/X.hs
@@ -0,0 +1,4 @@
+module X where
+
+x :: Integer
+x = 1
diff --git a/test/files/iface/x64/ghc822/Main.hi b/test/files/iface/x64/ghc822/Main.hi
new file mode 100644
index 0000000000..32ebe107a8
Binary files /dev/null and b/test/files/iface/x64/ghc822/Main.hi differ
diff --git a/test/files/iface/x64/ghc822/X.hi b/test/files/iface/x64/ghc822/X.hi
new file mode 100644
index 0000000000..e934d1f3e5
Binary files /dev/null and b/test/files/iface/x64/ghc822/X.hi differ
diff --git a/test/files/iface/x64/ghc844/Main.hi b/test/files/iface/x64/ghc844/Main.hi
new file mode 100644
index 0000000000..19f78d0e50
Binary files /dev/null and b/test/files/iface/x64/ghc844/Main.hi differ
diff --git a/test/files/iface/x64/ghc844/X.hi b/test/files/iface/x64/ghc844/X.hi
new file mode 100644
index 0000000000..84c8be089c
Binary files /dev/null and b/test/files/iface/x64/ghc844/X.hi differ
diff --git a/test/files/iface/x64/ghc864/Main.hi b/test/files/iface/x64/ghc864/Main.hi
new file mode 100644
index 0000000000..83fc501533
Binary files /dev/null and b/test/files/iface/x64/ghc864/Main.hi differ
diff --git a/test/files/iface/x64/ghc864/X.hi b/test/files/iface/x64/ghc864/X.hi
new file mode 100644
index 0000000000..2168134fa9
Binary files /dev/null and b/test/files/iface/x64/ghc864/X.hi differ
diff --git a/test/files/iface/x64/shell.nix b/test/files/iface/x64/shell.nix
new file mode 100644
index 0000000000..7691bf127b
--- /dev/null
+++ b/test/files/iface/x64/shell.nix
@@ -0,0 +1,29 @@
+with (import (builtins.fetchTarball {
+  name = "nixpkgs-19.03";
+  url = "https://github.com/nixos/nixpkgs/archive/release-19.03.tar.gz";
+  sha256 = "sha256:1p0xkcz183gwga9f9b24ihq3b7syjimkhr31y6h044yfmrkcnb6d";
+}) {});
+let
+  supportedVersions = [
+    "822"
+    "844"
+    "864"
+  ];
+  generate = version:
+    let ghc = haskell.compiler."ghc${version}";
+        main = "Main";
+    in ''
+      mkdir -p ghc${version}/
+      ${ghc}/bin/ghc -fforce-recomp -hidir ghc${version} ${main}.hs && \
+      rm *.o && \
+      rm ${main}
+    '';
+in
+  mkShell {
+    shellHook =
+      ''
+        generate() {
+          ${lib.concatMapStrings generate supportedVersions}
+        }
+      '';
+  }
diff --git a/test/integration/IntegrationSpec.hs b/test/integration/IntegrationSpec.hs
index 786a3af8bf..c1ac7ed934 100644
--- a/test/integration/IntegrationSpec.hs
+++ b/test/integration/IntegrationSpec.hs
@@ -238,8 +238,15 @@ copyTree src dst =
       Just suffix <- return $ stripPrefix src srcfp
       let dstfp = dst </> stripHeadSeparator suffix
       createDirectoryIfMissing True $ takeDirectory dstfp
-      createSymbolicLink srcfp dstfp `catch` \(_ :: IOException) ->
-        copyFile srcfp dstfp -- for Windows
+      -- copying yaml files so lock files won't get created in
+      -- the source directory
+      if takeFileName srcfp /=
"package.yaml" && + (takeExtensions srcfp == ".yaml" || takeExtensions srcfp == ".yml") + then + copyFile srcfp dstfp + else + createSymbolicLink srcfp dstfp `catch` \(_ :: IOException) -> + copyFile srcfp dstfp -- for Windows stripHeadSeparator :: FilePath -> FilePath stripHeadSeparator [] = [] diff --git a/test/integration/tests/1438-configure-options/Main.hs b/test/integration/tests/1438-configure-options/Main.hs new file mode 100644 index 0000000000..6d4762fec0 --- /dev/null +++ b/test/integration/tests/1438-configure-options/Main.hs @@ -0,0 +1,18 @@ +import StackTest +import Control.Monad (unless) +import Data.Foldable (for_) +import Data.List (isInfixOf) + +main :: IO () +main = do + stack ["clean", "--full"] + let stackYamlFiles = words "stack-locals.yaml stack-everything.yaml stack-targets.yaml stack-name.yaml" + for_ stackYamlFiles $ \stackYaml -> + stackErrStderr ["build", "--stack-yaml", stackYaml] $ \str -> + unless ("this is an invalid option" `isInfixOf` str) $ + error "Configure option is not present" + + stack ["build", "--stack-yaml", "stack-locals.yaml", "acme-dont"] + stack ["build", "--stack-yaml", "stack-targets.yaml", "acme-dont"] + stackErr ["build", "--stack-yaml", "stack-name.yaml", "acme-dont"] + stackErr ["build", "--stack-yaml", "stack-everything.yaml", "acme-dont"] diff --git a/test/integration/tests/1438-configure-options/files/.gitignore b/test/integration/tests/1438-configure-options/files/.gitignore new file mode 100644 index 0000000000..e9c64431ea --- /dev/null +++ b/test/integration/tests/1438-configure-options/files/.gitignore @@ -0,0 +1 @@ +name.cabal diff --git a/test/integration/tests/1438-configure-options/files/package.yaml b/test/integration/tests/1438-configure-options/files/package.yaml new file mode 100644 index 0000000000..13ccbf73ec --- /dev/null +++ b/test/integration/tests/1438-configure-options/files/package.yaml @@ -0,0 +1,5 @@ +name: name +version: 0 + +dependencies: base +library: {} diff --git a/test/integration/tests/1438-configure-options/files/stack-everything.yaml b/test/integration/tests/1438-configure-options/files/stack-everything.yaml new file mode 100644 index 0000000000..1e9cba4d18 --- /dev/null +++ b/test/integration/tests/1438-configure-options/files/stack-everything.yaml @@ -0,0 +1,8 @@ +resolver: ghc-8.2.2 + +extra-deps: +- acme-dont-1.1@rev:0 + +configure-options: + $everything: + - this is an invalid option diff --git a/test/integration/tests/1438-configure-options/files/stack-locals.yaml b/test/integration/tests/1438-configure-options/files/stack-locals.yaml new file mode 100644 index 0000000000..9e7c4215bc --- /dev/null +++ b/test/integration/tests/1438-configure-options/files/stack-locals.yaml @@ -0,0 +1,8 @@ +resolver: ghc-8.2.2 + +extra-deps: +- acme-dont-1.1@rev:0 + +configure-options: + $locals: + - this is an invalid option diff --git a/test/integration/tests/1438-configure-options/files/stack-name.yaml b/test/integration/tests/1438-configure-options/files/stack-name.yaml new file mode 100644 index 0000000000..98f92afe69 --- /dev/null +++ b/test/integration/tests/1438-configure-options/files/stack-name.yaml @@ -0,0 +1,10 @@ +resolver: ghc-8.2.2 + +extra-deps: +- acme-dont-1.1@rev:0 + +configure-options: + name: + - this is an invalid option + acme-dont: + - this is an invalid option diff --git a/test/integration/tests/1438-configure-options/files/stack-targets.yaml b/test/integration/tests/1438-configure-options/files/stack-targets.yaml new file mode 100644 index 0000000000..7705a92050 --- /dev/null +++ 
b/test/integration/tests/1438-configure-options/files/stack-targets.yaml @@ -0,0 +1,8 @@ +resolver: ghc-8.2.2 + +extra-deps: +- acme-dont-1.1@rev:0 + +configure-options: + $targets: + - this is an invalid option diff --git a/test/integration/tests/4220-freeze-command/Main.hs b/test/integration/tests/4220-freeze-command/Main.hs index 15c79d0ab6..cac220e80a 100644 --- a/test/integration/tests/4220-freeze-command/Main.hs +++ b/test/integration/tests/4220-freeze-command/Main.hs @@ -9,9 +9,9 @@ main = do stackCheckStdout ["freeze"] $ \stdOut -> do let contents = fromList [ "resolver:", - "size: 527165", + "size: 527200", "url: https://raw.githubusercontent.com/commercialhaskell/stackage-snapshots/master/lts/11/19.yaml", - "sha256: 0116ad1779b20ad2c9d6620f172531f13b12bb69867e78f4277157e28865dfd4", + "sha256: 16758b43c10c731bc142fdc5c005795db8338d7b4a28cd0af6730d739af2b306", "extra-deps:", "pantry-tree:", "hackage: a50-0.5@sha256:b8dfcc13dcbb12e444128bb0e17527a2a7a9bd74ca9450d6f6862c4b394ac054,1491", diff --git a/test/integration/tests/lock-files/Main.hs b/test/integration/tests/lock-files/Main.hs new file mode 100644 index 0000000000..8f7c89700f --- /dev/null +++ b/test/integration/tests/lock-files/Main.hs @@ -0,0 +1,17 @@ +import Control.Monad (unless, when) +import Data.List (isInfixOf) +import StackTest +import System.Directory + +main :: IO () +main = do + copyFile "stack-2-extras" "stack.yaml" + stack ["build"] + lock1 <- readFile "stack.yaml.lock" + unless ("acme-dont" `isInfixOf` lock1) $ + error "Package acme-dont wasn't found in Stack lock file" + copyFile "stack-1-extra" "stack.yaml" + stack ["build"] + lock2 <- readFile "stack.yaml.lock" + when ("acme-dont" `isInfixOf` lock2) $ + error "Package acme-dont shouldn't be in Stack lock file anymore" diff --git a/test/integration/tests/lock-files/files/Lib.hs b/test/integration/tests/lock-files/files/Lib.hs new file mode 100644 index 0000000000..a3b82e6e83 --- /dev/null +++ b/test/integration/tests/lock-files/files/Lib.hs @@ -0,0 +1,2 @@ +foo :: Int +foo = 42 diff --git a/test/integration/tests/lock-files/files/package.yaml b/test/integration/tests/lock-files/files/package.yaml new file mode 100644 index 0000000000..36e02ec5e7 --- /dev/null +++ b/test/integration/tests/lock-files/files/package.yaml @@ -0,0 +1,4 @@ +name: example +library: + dependencies: + - base diff --git a/test/integration/tests/lock-files/files/stack-1-extra b/test/integration/tests/lock-files/files/stack-1-extra new file mode 100644 index 0000000000..94527115ec --- /dev/null +++ b/test/integration/tests/lock-files/files/stack-1-extra @@ -0,0 +1,3 @@ +resolver: lts-11.22 +extra-deps: +- acme-cuteboy-0.1.0.0 diff --git a/test/integration/tests/lock-files/files/stack-2-extras b/test/integration/tests/lock-files/files/stack-2-extras new file mode 100644 index 0000000000..5415f52ee4 --- /dev/null +++ b/test/integration/tests/lock-files/files/stack-2-extras @@ -0,0 +1,4 @@ +resolver: lts-11.22 +extra-deps: +- acme-cuteboy-0.1.0.0 +- acme-dont-1.1
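+# Companion to stack-1-extra (see this test's Main.hs): building with this file
+# first records acme-dont in stack.yaml.lock; rebuilding with stack-1-extra
+# afterwards should drop that entry again.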