diff --git a/.github/workflows/tests.yml b/.github/workflows/tests.yml
index 22d1692e3..b7c85192d 100644
--- a/.github/workflows/tests.yml
+++ b/.github/workflows/tests.yml
@@ -2,7 +2,8 @@ name: "Tests + Compile"
on:
pull_request:
branches: [master]
-
+ push:
+    branches: [master]
jobs:
tests:
runs-on: ARM64
diff --git a/.gitignore b/.gitignore
index e8d960f41..548b8c738 100644
--- a/.gitignore
+++ b/.gitignore
@@ -22,3 +22,6 @@ devenv.local.nix
result*
.idea
+
+# Test folders
+static/Test.FileStorage.ControllerFunctionsSpec
\ No newline at end of file
diff --git a/Guide/deployment.markdown b/Guide/deployment.markdown
index 21180735d..ba10732c2 100644
--- a/Guide/deployment.markdown
+++ b/Guide/deployment.markdown
@@ -12,6 +12,35 @@ AWS EC2 is a good choice for deploying IHP in a professional setup.
### AWS infrastructure preparation
+#### Creating infrastructure with Terraform
+
+The EC2 instance, RDS database, VPC, subnets, security groups, etc., can be set up automatically using [Terraform](https://www.terraform.io/).
+
+1. Install Terraform
+2. Set up AWS credentials in `.aws/config` and `.aws/credentials`
+3. Copy the files from the `IaC/aws` folder of [the IaC-aws branch in ihp-boilerplate](https://github.com/digitallyinduced/ihp-boilerplate/tree/IaC-aws) to your IHP project repo. Then run the init command from the `IaC/aws` folder:
+ ```
+ terraform init
+ ```
+4. Create the file `terraform.tfvars` with the following content:
+ ```
+ prefix = "Project prefix for the resource names"
+ region = "AWS Region to deploy to"
+ az_1 = "Availability Zone 1"
+ az_2 = "Availability Zone 2"
+ key_name = "The key name of the SSH key-pair"
+ db_password = "The password for the RDS database"
+ ```
+   - The two AZs are needed to set up the RDS database.
+   - The SSH key pair should be created beforehand in the AWS web interface.
+   - A filled-in example of this file is shown right after this list.
+5. Run:
+ ```
+ terraform apply
+ ```
+6. Important data, like the RDS endpoint and the EC2 instance URL, is written to the file `db_info.txt`.
+
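+A filled-in `terraform.tfvars` might look like this (illustrative values only):
+
+```
+prefix      = "myapp"
+region      = "eu-central-1"
+az_1        = "eu-central-1a"
+az_2        = "eu-central-1b"
+key_name    = "myapp-deploy-key"
+db_password = "a-strong-password"
+```
+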
+Now the NixOS instance and Postgres database are set up, and an SSH connection can be established to the instance.
+
#### Creating a new EC2 Instance
Start a new EC2 instance and use the official NixOS AMI `NixOS-23.05.426.afc48694f2a-x86_64-linux`. You can find the latest NixOS AMI at https://nixos.org/download#nixos-amazon
diff --git a/Guide/hsx.markdown b/Guide/hsx.markdown
index f58a933d1..4fe57a38c 100644
--- a/Guide/hsx.markdown
+++ b/Guide/hsx.markdown
@@ -278,6 +278,102 @@ The underlying HTML library blaze currently does not support an empty HTML attri
If you use HTML entities, such as `&nbsp;` for a non-breaking space, you will notice they appear exactly like that. To output directly (i.e. unescaped) use the method `preEscapedToMarkup` from `Text.Blaze.Html5`.
+### Custom HSX and Unchecked HSX
+
+HSX provides two additional QuasiQuoters beyond the standard `[hsx|...|]` for increased flexibility: `uncheckedHsx` and `customHsx`.
+
+#### Using `uncheckedHsx`
+
+`uncheckedHsx` provides a quick way to bypass HSX's strict tag and attribute name checking.
+
+It will still check for a valid HTML structure, but it will accept any tag and attribute names.
+
+
+```haskell
+[uncheckedHsx|
+    <mycustomtag my-attribute="value">
+        Content
+    </mycustomtag>
+|]
+```
+
+While convenient for rapid development, use it with caution as you lose the benefits of compile-time guarantees for your markup.
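+
+For example, name checking is skipped, but the structure check still applies, so a mismatched closing tag remains a compile-time error (a sketch; the exact error text depends on the HSX version):
+
+```haskell
+-- Accepted: unknown tag and attribute names are fine
+[uncheckedHsx|<my-element my-attribute="value">Content</my-element>|]
+
+-- Rejected at compile time: the closing tag does not match
+-- [uncheckedHsx|<my-element>Content</other-element>|]
+```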
+
+#### Using `customHsx`
+
+`customHsx` allows you to extend the default HSX with additional whitelisted tag names and attribute names while maintaining the same strict compile-time checking of the default `hsx`.
+
+This makes it easier to use custom elements that often come with special attributes, as well as JavaScript libraries, for example `_hyperscript`, that use `_` as an attribute name.
+
+
+To use `customHsx`, you need to create it in a separate module due to Template Haskell restrictions. Here's how to set it up:
+
+1. First, create a new module for your custom HSX (e.g., `Application.Helper.CustomHsx`):
+
+```haskell
+module Application.Helper.CustomHsx where
+
+import IHP.Prelude
+import IHP.HSX.QQ (customHsx)
+import IHP.HSX.Parser
+import Language.Haskell.TH.Quote
+import qualified Data.Set as Set
+
+myHsx :: QuasiQuoter
+myHsx = customHsx
+ (HsxSettings
+ { checkMarkup = True
+ , additionalTagNames = Set.fromList ["book", "heading", "name"]
+ , additionalAttributeNames = Set.fromList ["_", "custom-attribute"]
+ }
+ )
+```
+
+Configuration options for `HsxSettings`:
+- `checkMarkup`: Boolean to enable/disable markup checking
+- `additionalTagNames`: Set of additional allowed tag names
+- `additionalAttributeNames`: Set of additional allowed attribute names
+
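+For instance, setting `checkMarkup = False` yields a fully permissive quoter similar to `uncheckedHsx` (a minimal sketch reusing the imports above; `permissiveHsx` is a hypothetical name):
+
+```haskell
+permissiveHsx :: QuasiQuoter
+permissiveHsx = customHsx
+    (HsxSettings
+        { checkMarkup = False
+        , additionalTagNames = Set.empty
+        , additionalAttributeNames = Set.empty
+        }
+    )
+```
+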
+2. Make it available in your views by adding it to your view helpers module:
+
+```haskell
+module Application.Helper.View (
+ module Application.Helper.View,
+ module Application.Helper.CustomHsx -- Add this line
+) where
+
+import IHP.ViewPrelude
+import Application.Helper.CustomHsx (myHsx) -- Add this line
+```
+
+3. Use it in your views:
+
+```haskell
+[myHsx|
+    <book>
+        <heading>My Book</heading>
+        <name>Author Name</name>
+    </book>
+|]
+```
+
+The custom HSX will validate that tags and attributes are either in the default HSX whitelist or in your additional sets. This gives you the flexibility to use custom elements and attributes.
+
+This approach is particularly useful for:
+- Web Components with custom attribute names
+- UI libraries with non-standard attributes
+- Domain-specific XML markup languages like [Hyperview](https://hyperview.org/docs/example_navigation)
+- Integration with third-party frameworks that extend HTML syntax
+
+`customHsx` whitelisting and even `uncheckedHsx` do not entirely help for libraries with very unusual symbols in their attributes, like Alpine.js, because they don't recognize HTML attributes starting with `@` or containing `:` in the attribute name. In these cases, the spread syntax `{...attributeList}` is likely your best bet.
+
+```haskell
+-- This will not work:
+[uncheckedHsx|<button @click="open = true">Expand</button>|]
+
+-- Using the spread syntax will work:
+[hsx|<button {...attributeList}>Expand</button>|]
+```
## Common HSX Patterns
diff --git a/Guide/package-management.markdown b/Guide/package-management.markdown
index 3d781b355..0c29b701e 100644
--- a/Guide/package-management.markdown
+++ b/Guide/package-management.markdown
@@ -371,47 +371,53 @@ After that try to run `devenv up`.
### Building Postgres With Extensions
-**TODO: Fix this for IHP v1.1.0**
-
For some applications you may want to install custom postgres extension
libraries and have them available in the nix store.
For example, to enable the [postgis](https://postgis.net/) spatial
and geographic objects in PostgreSQL, add
-`postgresExtensions = (p: [ p.postgis ]);` to your project's `default.nix` file as
-an attribute of the `"{ihp}/NixSupport/default.nix"` expression.
+`services.postgres.extensions = extensions: [ extensions.postgis ];` to your project's `flake.nix`:
```nix
-let
- ihp = builtins.fetchGit {
- url = "https://github.com/digitallyinduced/ihp.git";
- rev = "c6d40612697bb7905802f23b7753702d33b9e2c1";
- };
- haskellEnv = import "${ihp}/NixSupport/default.nix" {
- ihp = ihp;
- postgresExtensions = (p: [ p.postgis ]);
- haskellDeps = p: with p; [
- cabal-install
- base
- wai
- text
- hlint
- p.ihp
- google-oauth2
- ];
- otherDeps = p: with p; [
- ];
- projectPath = ./.;
+{
+ inputs = {
+ ihp.url = "path:///Users/marc/digitallyinduced/ihp";
+ ihp.inputs.nixpkgs.url = "github:mpscholten/nixpkgs/fix-th-desugar";
+ nixpkgs.follows = "ihp/nixpkgs";
+ flake-parts.follows = "ihp/flake-parts";
+ devenv.follows = "ihp/devenv";
+ systems.follows = "ihp/systems";
};
-in
- haskellEnv
+
+ outputs = inputs@{ ihp, flake-parts, systems, nixpkgs, ... }:
+ flake-parts.lib.mkFlake { inherit inputs; } {
+ systems = import systems;
+ imports = [ ihp.flakeModules.default ];
+
+ perSystem = { pkgs, ... }: {
+ ihp = {
+                appName = "app";
+ enable = true;
+ projectPath = ./.;
+ packages = with pkgs; [];
+ haskellPackages = p: with p; [
+ # ...
+ ];
+ };
+ devenv.shells.default = {
+ services.postgres.extensions = extensions: [ extensions.postgis ];
+ };
+ };
+ };
+}
+
```
Behind the scenes this passes the given function to the nix expression
`postgresql.withPackages`, making the extension part of your app's postgres package in the nix store.
-After the install you can run `create extension postgis;` to enable all the features of the
+After the install you can run `CREATE EXTENSION postgis;` to enable all the features of the
installed extension.
### Stopping Nix From Running Tests for a Haskell Dependency
diff --git a/Guide/testing.markdown b/Guide/testing.markdown
index e9b3ced68..93792a3e9 100644
--- a/Guide/testing.markdown
+++ b/Guide/testing.markdown
@@ -303,55 +303,4 @@ For more details on how to structure test suites see the [Hspec manual](http://h
## GitHub Actions
-The following GitHub Action workflow can be used to run the tests on CI:
-
-```yaml
-# .github/workflows/test.yml
-
-name: Test
-
-# Controls when the workflow will run
-on:
- # Triggers the workflow on push or pull request events but only for the main branch
- push:
- branches: [ main ]
- pull_request:
- branches: [ main ]
-
- # Allows you to run this workflow manually from the Actions tab
- workflow_dispatch:
-
-# A workflow run is made up of one or more jobs that can run sequentially or in parallel
-jobs:
- tests:
- name: Run Tests
- runs-on: ubuntu-latest
- steps:
- - uses: actions/checkout@v3
- - uses: cachix/install-nix-action@v27
- with:
- nix_path: nixpkgs=https://github.com/NixOS/nixpkgs/archive/51bcdc4cdaac48535dabf0ad4642a66774c609ed.tar.gz
-
- # Use the cachix cache for faster builds.
- - name: Cachix Init
- uses: cachix/cachix-action@v15
- with:
- name: digitallyinduced
- skipPush: true
-
- # Install direnv, which also `direnv allow`s the project.
- - uses: HatsuneMiku3939/direnv-action@v1.0.7
- with:
- direnvVersion: 2.32.3
-
- - name: Run project and tests
- run: |
- # Build generated files.
- nix-shell --run "make build/Generated/Types.hs"
-
- # Start the project in the background.
- nix-shell --run "devenv up &"
-
- # Execute the tests.
- nix-shell --run "runghc $(make print-ghc-extensions) -i. -ibuild -iConfig Test/Main.hs"
-```
+A GitHub Actions workflow can be used to run the tests on CI and handle deployments. Consult the [IHP Boilerplate example](https://github.com/digitallyinduced/ihp-boilerplate/blob/master/.github/workflows/test.yml) for more details.
diff --git a/Guide/your-first-project.markdown b/Guide/your-first-project.markdown
index e33b8a126..24b211a60 100644
--- a/Guide/your-first-project.markdown
+++ b/Guide/your-first-project.markdown
@@ -20,7 +20,7 @@ The first time you set up IHP, this command might take 10 - 15 minutes to instal
If you don't already use cachix, you will be prompted to install it. You don't need it, but it is highly recommended, as it reduces build time dramatically. Learn more about cachix [here](https://cachix.org/).
-While the build is running, take a look at ["What Is Nix"](https://engineering.shopify.com/blogs/engineering/what-is-nix) by Shopify to get a general understanding of how Nix works.
+While the build is running, take a look at ["What Is Nix"](https://shopify.engineering/what-is-nix) by Shopify to get a general understanding of how Nix works.
In case some errors appear now or in later steps:
@@ -67,7 +67,7 @@ cd blog
Start the development server by running the following in the `blog` directory:
```bash
-./start
+devenv up
```
Your application is starting now. The development server will automatically launch the built-in IDE.
diff --git a/IHP/DataSync/ChangeNotifications.hs b/IHP/DataSync/ChangeNotifications.hs
index f492248a8..8a38d59d4 100644
--- a/IHP/DataSync/ChangeNotifications.hs
+++ b/IHP/DataSync/ChangeNotifications.hs
@@ -41,7 +41,7 @@ createNotificationFunction :: RLS.TableWithRLS -> PG.Query
createNotificationFunction table = [i|
DO $$
BEGIN
- CREATE FUNCTION #{functionName}() RETURNS TRIGGER AS $BODY$
+ CREATE FUNCTION "#{functionName}"() RETURNS TRIGGER AS $BODY$
DECLARE
payload TEXT;
large_pg_notification_id UUID;
@@ -86,24 +86,31 @@ createNotificationFunction table = [i|
RETURN new;
END;
$BODY$ language plpgsql;
- DROP TRIGGER IF EXISTS #{insertTriggerName} ON #{tableName};
- DROP TRIGGER IF EXISTS #{updateTriggerName} ON #{tableName};
- DROP TRIGGER IF EXISTS #{deleteTriggerName} ON #{tableName};
+ DROP TRIGGER IF EXISTS "#{insertTriggerName}" ON "#{tableName}";
+ DROP TRIGGER IF EXISTS "#{updateTriggerName}" ON "#{tableName}";
+ DROP TRIGGER IF EXISTS "#{deleteTriggerName}" ON "#{tableName}";
- CREATE TRIGGER #{insertTriggerName} AFTER INSERT ON "#{tableName}" FOR EACH ROW EXECUTE PROCEDURE #{functionName}();
- CREATE TRIGGER #{updateTriggerName} AFTER UPDATE ON "#{tableName}" FOR EACH ROW EXECUTE PROCEDURE #{functionName}();
- CREATE TRIGGER #{deleteTriggerName} AFTER DELETE ON "#{tableName}" FOR EACH ROW EXECUTE PROCEDURE #{functionName}();
+ CREATE TRIGGER "#{insertTriggerName}" AFTER INSERT ON "#{tableName}" FOR EACH ROW EXECUTE PROCEDURE "#{functionName}"();
+ CREATE TRIGGER "#{updateTriggerName}" AFTER UPDATE ON "#{tableName}" FOR EACH ROW EXECUTE PROCEDURE "#{functionName}"();
+ CREATE TRIGGER "#{deleteTriggerName}" AFTER DELETE ON "#{tableName}" FOR EACH ROW EXECUTE PROCEDURE "#{functionName}"();
EXCEPTION
WHEN duplicate_function THEN
null;
- CREATE UNLOGGED TABLE IF NOT EXISTS large_pg_notifications (
- id UUID DEFAULT uuid_generate_v4() PRIMARY KEY NOT NULL,
- payload TEXT DEFAULT null,
- created_at TIMESTAMP WITH TIME ZONE DEFAULT now() NOT NULL
- );
- CREATE INDEX IF NOT EXISTS large_pg_notifications_created_at_index ON large_pg_notifications (created_at);
+ IF NOT EXISTS (
+ SELECT FROM pg_catalog.pg_class c
+ JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace
+ WHERE c.relname = 'large_pg_notifications'
+ AND n.nspname = 'public'
+ ) THEN
+ CREATE UNLOGGED TABLE large_pg_notifications (
+ id UUID DEFAULT uuid_generate_v4() PRIMARY KEY NOT NULL,
+ payload TEXT DEFAULT NULL,
+ created_at TIMESTAMP WITH TIME ZONE DEFAULT now() NOT NULL
+ );
+ CREATE INDEX large_pg_notifications_created_at_index ON large_pg_notifications (created_at);
+ END IF;
END; $$
|]
diff --git a/IHP/DataSync/Controller.hs b/IHP/DataSync/Controller.hs
index 6cd6daeff..85790b5fb 100644
--- a/IHP/DataSync/Controller.hs
+++ b/IHP/DataSync/Controller.hs
@@ -8,6 +8,7 @@ import IHP.DataSync.RowLevelSecurity
import qualified Database.PostgreSQL.Simple.ToField as PG
import qualified IHP.DataSync.ChangeNotifications as ChangeNotifications
import IHP.DataSync.ControllerImpl (runDataSyncController, cleanupAllSubscriptions)
+import IHP.DataSync.DynamicQueryCompiler (camelCaseRenamer)
instance (
PG.ToField (PrimaryKey (GetTableName CurrentUserRecord))
@@ -21,5 +22,5 @@ instance (
run = do
ensureRLSEnabled <- makeCachedEnsureRLSEnabled
installTableChangeTriggers <- ChangeNotifications.makeCachedInstallTableChangeTriggers
- runDataSyncController ensureRLSEnabled installTableChangeTriggers (receiveData @ByteString) sendJSON (\_ _ -> pure ())
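+        -- The new argument picks a Renamer per table; camelCaseRenamer keeps the default camelCase field names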
+ runDataSyncController ensureRLSEnabled installTableChangeTriggers (receiveData @ByteString) sendJSON (\_ _ -> pure ()) (\_ -> camelCaseRenamer)
onClose = cleanupAllSubscriptions
diff --git a/IHP/DataSync/ControllerImpl.hs b/IHP/DataSync/ControllerImpl.hs
index 431953432..17fd66217 100644
--- a/IHP/DataSync/ControllerImpl.hs
+++ b/IHP/DataSync/ControllerImpl.hs
@@ -45,11 +45,11 @@ runDataSyncController ::
, Typeable CurrentUserRecord
, HasNewSessionUrl CurrentUserRecord
, Show (PrimaryKey (GetTableName CurrentUserRecord))
- ) => EnsureRLSEnabledFn -> InstallTableChangeTriggerFn -> IO ByteString -> SendJSONFn -> HandleCustomMessageFn -> IO ()
-runDataSyncController ensureRLSEnabled installTableChangeTriggers receiveData sendJSON handleCustomMessage = do
+ ) => EnsureRLSEnabledFn -> InstallTableChangeTriggerFn -> IO ByteString -> SendJSONFn -> HandleCustomMessageFn -> (Text -> Renamer) -> IO ()
+runDataSyncController ensureRLSEnabled installTableChangeTriggers receiveData sendJSON handleCustomMessage renamer = do
setState DataSyncReady { subscriptions = HashMap.empty, transactions = HashMap.empty, asyncs = [] }
- let handleMessage = buildMessageHandler ensureRLSEnabled installTableChangeTriggers sendJSON handleCustomMessage
+ let handleMessage :: DataSyncMessage -> IO () = buildMessageHandler ensureRLSEnabled installTableChangeTriggers sendJSON handleCustomMessage renamer
forever do
message <- Aeson.eitherDecodeStrict' <$> receiveData
@@ -91,17 +91,18 @@ buildMessageHandler ::
, HasNewSessionUrl CurrentUserRecord
, Show (PrimaryKey (GetTableName CurrentUserRecord))
)
- => EnsureRLSEnabledFn -> InstallTableChangeTriggerFn -> SendJSONFn -> HandleCustomMessageFn -> (DataSyncMessage -> IO ())
-buildMessageHandler ensureRLSEnabled installTableChangeTriggers sendJSON handleCustomMessage = handleMessage
+ => EnsureRLSEnabledFn -> InstallTableChangeTriggerFn -> SendJSONFn -> HandleCustomMessageFn -> (Text -> Renamer) -> (DataSyncMessage -> IO ())
+buildMessageHandler ensureRLSEnabled installTableChangeTriggers sendJSON handleCustomMessage renamer = handleMessage
where
pgListener = ?applicationContext.pgListener
handleMessage :: DataSyncMessage -> IO ()
handleMessage DataSyncQuery { query, requestId, transactionId } = do
ensureRLSEnabled (query.table)
- let (theQuery, theParams) = compileQuery query
+ let (theQuery, theParams) = compileQueryWithRenamer (renamer query.table) query
- result :: [[Field]] <- sqlQueryWithRLSAndTransactionId transactionId theQuery theParams
+ rawResult :: [[Field]] <- sqlQueryWithRLSAndTransactionId transactionId theQuery theParams
+ let result = map (map (renameField (renamer query.table))) rawResult
sendJSON DataSyncResult { result, requestId }
@@ -118,16 +119,17 @@ buildMessageHandler ensureRLSEnabled installTableChangeTriggers sendJSON handleC
close <- MVar.newEmptyMVar
atomicModifyIORef'' ?state (\state -> state |> modify #subscriptions (HashMap.insert subscriptionId close))
- let (theQuery, theParams) = compileQuery query
+ let (theQuery, theParams) = compileQueryWithRenamer (renamer query.table) query
- result :: [[Field]] <- sqlQueryWithRLS theQuery theParams
+ rawResult :: [[Field]] <- sqlQueryWithRLS theQuery theParams
+ let result = map (map (renameField (renamer query.table))) rawResult
let tableName = query.table
-- We need to keep track of all the ids of entities we're watching to make
-- sure that we only send update notifications to clients that can actually
-- access the record (e.g. if a RLS policy denies access)
- let watchedRecordIds = recordIds result
+ let watchedRecordIds = recordIds rawResult
-- Store it in IORef as an INSERT requires us to add an id
watchedRecordIdsRef <- newIORef (Set.fromList watchedRecordIds)
@@ -149,11 +151,12 @@ buildMessageHandler ensureRLSEnabled installTableChangeTriggers sendJSON handleC
newRecord :: [[Field]] <- sqlQueryWithRLS ("SELECT * FROM (" <> theQuery <> ") AS records WHERE records.id = ? LIMIT 1") (theParams <> [PG.toField id])
case headMay newRecord of
- Just record -> do
+ Just rawRecord -> do
-- Add the new record to 'watchedRecordIdsRef'
-- Otherwise the updates and deletes will not be dispatched to the client
modifyIORef' watchedRecordIdsRef (Set.insert id)
+ let record = map (renameField (renamer tableName)) rawRecord
sendJSON DidInsert { subscriptionId, record }
Nothing -> pure ()
ChangeNotifications.DidUpdate { id, changeSet } -> do
@@ -167,7 +170,7 @@ buildMessageHandler ensureRLSEnabled installTableChangeTriggers sendJSON handleC
changes <- ChangeNotifications.retrieveChanges changeSet
if isRecordInResultSet
- then sendJSON DidUpdate { subscriptionId, id, changeSet = changesToValue changes }
+ then sendJSON DidUpdate { subscriptionId, id, changeSet = changesToValue (renamer tableName) changes }
else sendJSON DidDelete { subscriptionId, id }
ChangeNotifications.DidDelete { id } -> do
-- Only send the notification if the deleted record was part of the initial
@@ -202,7 +205,7 @@ buildMessageHandler ensureRLSEnabled installTableChangeTriggers sendJSON handleC
let query = "INSERT INTO ? ? VALUES ? RETURNING *"
let columns = record
|> HashMap.keys
- |> map fieldNameToColumnName
+ |> map (renamer table).fieldToColumn
let values = record
|> HashMap.elems
@@ -213,7 +216,11 @@ buildMessageHandler ensureRLSEnabled installTableChangeTriggers sendJSON handleC
result :: [[Field]] <- sqlQueryWithRLSAndTransactionId transactionId query params
case result of
- [record] -> sendJSON DidCreateRecord { requestId, record }
+ [rawRecord] ->
+ let
+ record = map (renameField (renamer table)) rawRecord
+ in
+ sendJSON DidCreateRecord { requestId, record }
otherwise -> error "Unexpected result in CreateRecordMessage handler"
pure ()
@@ -228,7 +235,7 @@ buildMessageHandler ensureRLSEnabled installTableChangeTriggers sendJSON handleC
Just value -> value
Nothing -> error "Atleast one record is required"
|> HashMap.keys
- |> map fieldNameToColumnName
+ |> map (renamer table).fieldToColumn
let values = records
|> map (\object ->
@@ -240,7 +247,8 @@ buildMessageHandler ensureRLSEnabled installTableChangeTriggers sendJSON handleC
let params = (PG.Identifier table, PG.In (map PG.Identifier columns), PG.Values [] values)
- records :: [[Field]] <- sqlQueryWithRLSAndTransactionId transactionId query params
+ rawRecords :: [[Field]] <- sqlQueryWithRLSAndTransactionId transactionId query params
+ let records = map (map (renameField (renamer table))) rawRecords
sendJSON DidCreateRecords { requestId, records }
@@ -251,7 +259,7 @@ buildMessageHandler ensureRLSEnabled installTableChangeTriggers sendJSON handleC
let columns = patch
|> HashMap.keys
- |> map fieldNameToColumnName
+ |> map (renamer table).fieldToColumn
|> map PG.Identifier
let values = patch
@@ -272,7 +280,11 @@ buildMessageHandler ensureRLSEnabled installTableChangeTriggers sendJSON handleC
result :: [[Field]] <- sqlQueryWithRLSAndTransactionId transactionId (PG.Query query) params
case result of
- [record] -> sendJSON DidUpdateRecord { requestId, record }
+ [rawRecord] ->
+ let
+ record = map (renameField (renamer table)) rawRecord
+ in
+ sendJSON DidUpdateRecord { requestId, record }
otherwise -> error "Could not apply the update to the given record. Are you sure the record ID you passed is correct? If the record ID is correct, likely the row level security policy is not making the record visible to the UPDATE operation."
pure ()
@@ -282,7 +294,7 @@ buildMessageHandler ensureRLSEnabled installTableChangeTriggers sendJSON handleC
let columns = patch
|> HashMap.keys
- |> map fieldNameToColumnName
+ |> map (renamer table).fieldToColumn
|> map PG.Identifier
let values = patch
@@ -300,7 +312,8 @@ buildMessageHandler ensureRLSEnabled installTableChangeTriggers sendJSON handleC
<> (join (map (\(key, value) -> [PG.toField key, value]) keyValues))
<> [PG.toField (PG.In ids)]
- records <- sqlQueryWithRLSAndTransactionId transactionId (PG.Query query) params
+ rawRecords <- sqlQueryWithRLSAndTransactionId transactionId (PG.Query query) params
+ let records = map (map (renameField (renamer table))) rawRecords
sendJSON DidUpdateRecords { requestId, records }
@@ -380,10 +393,10 @@ cleanupAllSubscriptions = do
DataSyncReady { asyncs } -> forEach asyncs uninterruptibleCancel
_ -> pure ()
-changesToValue :: [ChangeNotifications.Change] -> Value
-changesToValue changes = object (map changeToPair changes)
+changesToValue :: Renamer -> [ChangeNotifications.Change] -> Value
+changesToValue renamer changes = object (map changeToPair changes)
where
- changeToPair ChangeNotifications.Change { col, new } = (Aeson.fromText $ columnNameToFieldName col) .= new
+ changeToPair ChangeNotifications.Change { col, new } = (Aeson.fromText $ renamer.columnToField col) .= new
runInModelContextWithTransaction :: (?state :: IORef DataSyncController, ?modelContext :: ModelContext) => ((?modelContext :: ModelContext) => IO result) -> Maybe UUID -> IO result
runInModelContextWithTransaction function (Just transactionId) = do
diff --git a/IHP/DataSync/DynamicQuery.hs b/IHP/DataSync/DynamicQuery.hs
index 9f5c938ec..2ec915e5c 100644
--- a/IHP/DataSync/DynamicQuery.hs
+++ b/IHP/DataSync/DynamicQuery.hs
@@ -127,7 +127,7 @@ instance PG.FromField Field where
pure Field { .. }
where
fieldName = (PG.name field)
- |> fmap (columnNameToFieldName . cs)
+ |> fmap cs
|> fromMaybe ""
instance PG.FromField DynamicValue where
diff --git a/IHP/DataSync/DynamicQueryCompiler.hs b/IHP/DataSync/DynamicQueryCompiler.hs
index db50078ea..f1831d406 100644
--- a/IHP/DataSync/DynamicQueryCompiler.hs
+++ b/IHP/DataSync/DynamicQueryCompiler.hs
@@ -13,8 +13,43 @@ import qualified Database.PostgreSQL.Simple.ToField as PG
import qualified Database.PostgreSQL.Simple.Types as PG
import qualified Data.List as List
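+-- | Translates between the column names used in the database (usually @snake_case@)
+-- and the field names exposed to DataSync clients (usually @camelCase@).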
+data Renamer = Renamer
+ { fieldToColumn :: Text -> Text
+ , columnToField :: Text -> Text
+ }
+
compileQuery :: DynamicSQLQuery -> (PG.Query, [PG.Action])
-compileQuery DynamicSQLQuery { .. } = (sql, args)
+compileQuery = compileQueryWithRenamer camelCaseRenamer
+
+compileQueryWithRenamer :: Renamer -> DynamicSQLQuery -> (PG.Query, [PG.Action])
+compileQueryWithRenamer renamer query = compileQueryMapped (mapColumnNames renamer.fieldToColumn query)
+
+-- | Default renamer used by DataSync.
+--
+-- Transforms JS inputs in @camelCase@ to snake_case for the database
+-- and DB outputs in @snake_case@ back to @camelCase@
+camelCaseRenamer :: Renamer
+camelCaseRenamer =
+ Renamer
+ { fieldToColumn = fieldNameToColumnName
+ , columnToField = columnNameToFieldName
+ }
+
+-- | Renamer that does not modify the column names
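+--
+-- For example, pass @\_table -> unmodifiedRenamer@ to 'runDataSyncController'
+-- to keep the raw @snake_case@ column names in DataSync messages.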
+unmodifiedRenamer :: Renamer
+unmodifiedRenamer =
+ Renamer
+ { fieldToColumn = id
+ , columnToField = id
+ }
+
+-- | When a Field is retrieved from the database, it's all in @snake_case@. This turns it into @camelCase@
+renameField :: Renamer -> Field -> Field
+renameField renamer field =
+ field { fieldName = renamer.columnToField field.fieldName }
+
+compileQueryMapped :: DynamicSQLQuery -> (PG.Query, [PG.Action])
+compileQueryMapped DynamicSQLQuery { .. } = (sql, args)
where
sql = "SELECT" <> distinctOnSql <> "? FROM ?" <> whereSql <> orderBySql <> limitSql <> offsetSql
args = distinctOnArgs
@@ -28,7 +63,7 @@ compileQuery DynamicSQLQuery { .. } = (sql, args)
<> offsetArgs
(distinctOnSql, distinctOnArgs) = case distinctOnColumn of
- Just column -> (" DISTINCT ON (?) ", [PG.toField $ PG.Identifier (fieldNameToColumnName $ cs column)])
+ Just column -> (" DISTINCT ON (?) ", [PG.toField $ PG.Identifier (cs column)])
Nothing -> (" ", [])
(orderBySql, orderByArgs) = case orderByClause of
@@ -38,13 +73,13 @@ compileQuery DynamicSQLQuery { .. } = (sql, args)
, orderByClauses
|> map (\case
OrderByClause { orderByColumn, orderByDirection } ->
- [ PG.toField $ PG.Identifier (fieldNameToColumnName $ cs orderByColumn)
+ [ PG.toField $ PG.Identifier (cs orderByColumn)
, PG.toField $ if orderByDirection == QueryBuilder.Desc
then PG.Plain "DESC"
else PG.Plain ""
]
OrderByTSRank { tsvector, tsquery } ->
- [ PG.toField $ PG.Identifier (fieldNameToColumnName tsvector)
+ [ PG.toField $ PG.Identifier tsvector
, PG.toField tsquery
]
)
@@ -63,6 +98,29 @@ compileQuery DynamicSQLQuery { .. } = (sql, args)
Just offset -> (" OFFSET ?", [PG.toField offset])
Nothing -> ("", [])
+-- | Used to transform column names from @camelCase@ to @snake_case@
+mapColumnNames :: (Text -> Text) -> DynamicSQLQuery -> DynamicSQLQuery
+mapColumnNames rename query =
+ query
+ { selectedColumns = mapSelectedColumns query.selectedColumns
+ , whereCondition = mapConditionExpression <$> query.whereCondition
+ , orderByClause = map mapOrderByClause query.orderByClause
+ , distinctOnColumn = (cs . rename . cs) <$> query.distinctOnColumn
+ }
+ where
+ mapSelectedColumns :: SelectedColumns -> SelectedColumns
+ mapSelectedColumns SelectAll = SelectAll
+ mapSelectedColumns (SelectSpecific columns) = SelectSpecific (map rename columns)
+
+ mapConditionExpression :: ConditionExpression -> ConditionExpression
+ mapConditionExpression ColumnExpression { field } = ColumnExpression { field = rename field }
+ mapConditionExpression InfixOperatorExpression { left, op, right } = InfixOperatorExpression { left = mapConditionExpression left, op, right = mapConditionExpression right }
+ mapConditionExpression otherwise = otherwise
+
+ mapOrderByClause :: OrderByClause -> OrderByClause
+ mapOrderByClause OrderByClause { orderByColumn, orderByDirection } = OrderByClause { orderByColumn = cs (rename (cs orderByColumn)), orderByDirection }
+ mapOrderByClause otherwise = otherwise
+
compileOrderByClause :: OrderByClause -> Text
compileOrderByClause OrderByClause {} = "? ?"
compileOrderByClause OrderByTSRank { tsvector, tsquery } = "ts_rank(?, to_tsquery('english', ?))"
@@ -79,7 +137,7 @@ compileSelectedColumns (SelectSpecific fields) = PG.Many args
-- TODO: validate query against schema
compileCondition :: ConditionExpression -> (PG.Query, [PG.Action])
-compileCondition (ColumnExpression column) = ("?", [PG.toField $ PG.Identifier (fieldNameToColumnName column)])
+compileCondition (ColumnExpression column) = ("?", [PG.toField $ PG.Identifier column])
compileCondition (InfixOperatorExpression a OpEqual (LiteralExpression Null)) = compileCondition (InfixOperatorExpression a OpIs (LiteralExpression Null)) -- Turn 'a = NULL' into 'a IS NULL'
compileCondition (InfixOperatorExpression a OpNotEqual (LiteralExpression Null)) = compileCondition (InfixOperatorExpression a OpIsNot (LiteralExpression Null)) -- Turn 'a <> NULL' into 'a IS NOT NULL'
compileCondition (InfixOperatorExpression a OpIn (ListExpression { values })) | (Null `List.elem` values) =
diff --git a/IHP/DataSync/RowLevelSecurity.hs b/IHP/DataSync/RowLevelSecurity.hs
index d26d63612..42c8a2835 100644
--- a/IHP/DataSync/RowLevelSecurity.hs
+++ b/IHP/DataSync/RowLevelSecurity.hs
@@ -128,7 +128,7 @@ makeCachedEnsureRLSEnabled = do
-- >>> hasRLSEnabled "my_table"
-- True
hasRLSEnabled :: (?modelContext :: ModelContext) => Text -> IO Bool
-hasRLSEnabled table = sqlQueryScalar "SELECT relrowsecurity FROM pg_class WHERE oid = ?::regclass" [table]
+hasRLSEnabled table = sqlQueryScalar "SELECT relrowsecurity FROM pg_class WHERE oid = quote_ident(?)::regclass" [table]
-- | Can be constructed using 'ensureRLSEnabled'
--
diff --git a/IHP/FileStorage/Config.hs b/IHP/FileStorage/Config.hs
index f5a5159fc..83f806577 100644
--- a/IHP/FileStorage/Config.hs
+++ b/IHP/FileStorage/Config.hs
@@ -94,7 +94,9 @@ initMinioStorage server bucket = do
-- > initStaticDirStorage
--
initStaticDirStorage :: State.StateT TMap.TMap IO ()
-initStaticDirStorage = option StaticDirStorage
+initStaticDirStorage = do
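+    -- The storage directory defaults to "static/" and can be overridden via the IHP_STORAGE_DIR env var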
+ directory <- EnvVar.envOrDefault "IHP_STORAGE_DIR" "static/"
+ option StaticDirStorage { directory }
-- | The Filebase access key and secret key have to be provided using the @FILEBASE_KEY@ and @FILEBASE_SECRET@ env vars.
--
diff --git a/IHP/FileStorage/ControllerFunctions.hs b/IHP/FileStorage/ControllerFunctions.hs
index 864b9a74e..6eb4cdd56 100644
--- a/IHP/FileStorage/ControllerFunctions.hs
+++ b/IHP/FileStorage/ControllerFunctions.hs
@@ -96,23 +96,23 @@ storeFileWithOptions fileInfo options = do
let fileName = options.fileName |> fromMaybe objectId
- let directory = options.directory
- let objectPath = directory <> "/" <> UUID.toText fileName
+ let objectPath = options.directory <> "/" <> UUID.toText fileName
let preprocess = options.preprocess
fileInfo <- preprocess fileInfo
url <- case storage of
- StaticDirStorage -> do
- let destPath :: Text = "static/" <> objectPath
- Directory.createDirectoryIfMissing True (cs $ "static/" <> directory)
+ StaticDirStorage { directory } -> do
+ let destPath :: Text = directory <> objectPath
+ Directory.createDirectoryIfMissing True (cs $ directory <> options.directory)
fileInfo
|> (.fileContent)
|> LBS.writeFile (cs destPath)
let frameworkConfig = ?context.frameworkConfig
- pure $ frameworkConfig.baseUrl <> "/" <> objectPath
+ -- Prefix with a slash so it can be used in URLs, even if the baseUrl is empty.
+ pure $ "/" <> objectPath
S3Storage { connectInfo, bucket, baseUrl } -> do
let payload = fileInfo
|> (.fileContent)
@@ -224,9 +224,15 @@ createTemporaryDownloadUrlFromPathWithExpiredAt :: (?context :: context, ConfigP
createTemporaryDownloadUrlFromPathWithExpiredAt validInSeconds objectPath = do
publicUrlExpiredAt <- addUTCTime (fromIntegral validInSeconds) <$> getCurrentTime
case storage of
- StaticDirStorage -> do
+ StaticDirStorage {} -> do
let frameworkConfig = ?context.frameworkConfig
- let url = frameworkConfig.baseUrl <> "/" <> objectPath
+ let urlSchemes = ["http://", "https://"]
+
+ let url = if any (`isPrefixOf` objectPath) urlSchemes
+                    -- Backwards compatibility: previously we saved the full URL instead of only the relative path, so use it as is.
+ then objectPath
+ -- We have the relative path (prefixed with slash), so add the baseUrl.
+ else frameworkConfig.baseUrl <> objectPath
pure TemporaryDownloadUrl { url = cs url, expiredAt = publicUrlExpiredAt }
S3Storage { connectInfo, bucket} -> do
@@ -391,8 +397,8 @@ uploadToStorage field record = uploadToStorageWithOptions def field record
removeFileFromStorage :: (?context :: context, ConfigProvider context) => StoredFile -> IO (Either MinioErr ())
removeFileFromStorage StoredFile { path, url } = do
case storage of
- StaticDirStorage -> do
- let fullPath :: String = cs $ "static/" <> path
+ StaticDirStorage { directory } -> do
+ let fullPath :: String = cs $ directory <> path
Directory.removeFile fullPath
pure $ Right ()
S3Storage { connectInfo, bucket} -> do
@@ -408,5 +414,5 @@ storage = ?context.frameworkConfig.appConfig
-- | Returns the prefix for the storage. This is the storage directory (e.g. @static/@) for 'StaticDirStorage', or an empty string otherwise.
storagePrefix :: (?context :: ControllerContext) => Text
storagePrefix = case storage of
- StaticDirStorage -> "static/"
- _ -> ""
\ No newline at end of file
+ StaticDirStorage { directory } -> directory
+ _ -> ""
diff --git a/IHP/FileStorage/Types.hs b/IHP/FileStorage/Types.hs
index 61962f5b8..320871c69 100644
--- a/IHP/FileStorage/Types.hs
+++ b/IHP/FileStorage/Types.hs
@@ -14,7 +14,7 @@ import qualified Network.Minio as Minio
import qualified Network.Wai.Parse as Wai
data FileStorage
- = StaticDirStorage -- ^ Stores files publicly visible inside the project's @static@ directory
+    = StaticDirStorage { directory :: !Text } -- ^ Stores files publicly visible inside the given directory, e.g. @static/@
| S3Storage { connectInfo :: Minio.ConnectInfo, bucket :: Text, baseUrl :: Text } -- ^ Stores files inside a S3 compatible cloud storage
-- | Result of a 'storeFile' operation
diff --git a/IHP/NameSupport.hs b/IHP/NameSupport.hs
index 2d9c0ef73..e4e9e9c22 100644
--- a/IHP/NameSupport.hs
+++ b/IHP/NameSupport.hs
@@ -13,6 +13,7 @@ module IHP.NameSupport
, fieldNameToColumnName
, escapeHaskellKeyword
, tableNameToControllerName
+, tableNameToViewName
, enumValueToControllerName
, toSlug
, module IHP.NameSupport.Inflections
@@ -67,6 +68,17 @@ tableNameToControllerName tableName = do
else ucfirst tableName
{-# INLINABLE tableNameToControllerName #-}
+-- | Transforms an underscore table name to a name for a view
+--
+-- >>> tableNameToViewName "users"
+--
+-- >>> tableNameToViewName "projects"
+--
+-- >>> tableNameToViewName "user_projects"
+tableNameToViewName :: Text -> Text
+tableNameToViewName = tableNameToControllerName
+{-# INLINABLE tableNameToViewName #-}
+
-- | Transforms an enum value to a name for a model
--
-- >>> enumValueToControllerName "happy"
diff --git a/IHP/QueryBuilder.hs b/IHP/QueryBuilder.hs
index fd3c8d093..f07a2e8f0 100644
--- a/IHP/QueryBuilder.hs
+++ b/IHP/QueryBuilder.hs
@@ -165,7 +165,7 @@ class HasQueryBuilder queryBuilderProvider joinRegister | queryBuilderProvider -
injectQueryBuilder :: QueryBuilder table -> queryBuilderProvider table
getQueryIndex :: queryBuilderProvider table -> Maybe ByteString
getQueryIndex _ = Nothing
- {-# INLINE getQueryIndex #-}
+ {-# INLINABLE getQueryIndex #-}
-- Wrapper for QueryBuilders resulting from joins. Associates a joinRegister type.
newtype JoinQueryBuilderWrapper joinRegister table = JoinQueryBuilderWrapper (QueryBuilder table)
@@ -186,24 +186,24 @@ instance HasQueryBuilder QueryBuilder EmptyModelList where
-- JoinQueryBuilderWrappers have query builders
instance HasQueryBuilder (JoinQueryBuilderWrapper joinRegister) joinRegister where
getQueryBuilder (JoinQueryBuilderWrapper queryBuilder) = queryBuilder
- {-# INLINE getQueryBuilder #-}
+ {-# INLINABLE getQueryBuilder #-}
injectQueryBuilder = JoinQueryBuilderWrapper
- {-# INLINE injectQueryBuilder #-}
+ {-# INLINABLE injectQueryBuilder #-}
-- NoJoinQueryBuilderWrapper have query builders and the join register does not allow any joins
instance HasQueryBuilder NoJoinQueryBuilderWrapper NoJoins where
getQueryBuilder (NoJoinQueryBuilderWrapper queryBuilder) = queryBuilder
- {-# INLINE getQueryBuilder #-}
+ {-# INLINABLE getQueryBuilder #-}
injectQueryBuilder = NoJoinQueryBuilderWrapper
- {-# INLINE injectQueryBuilder #-}
+ {-# INLINABLE injectQueryBuilder #-}
instance (KnownSymbol foreignTable, foreignModel ~ GetModelByTableName foreignTable , KnownSymbol indexColumn, HasField indexColumn foreignModel indexValue) => HasQueryBuilder (LabeledQueryBuilderWrapper foreignTable indexColumn indexValue) NoJoins where
getQueryBuilder (LabeledQueryBuilderWrapper queryBuilder) = queryBuilder
- {-# INLINE getQueryBuilder #-}
+ {-# INLINABLE getQueryBuilder #-}
injectQueryBuilder = LabeledQueryBuilderWrapper
- {-# INLINE injectQueryBuilder #-}
+ {-# INLINABLE injectQueryBuilder #-}
getQueryIndex _ = Just $ symbolToByteString @foreignTable <> "." <> (Text.encodeUtf8 . fieldNameToColumnName) (symbolToText @indexColumn)
- {-# INLINE getQueryIndex #-}
+ {-# INLINABLE getQueryIndex #-}
data QueryBuilder (table :: Symbol) =
diff --git a/IHP/View/CSSFramework.hs b/IHP/View/CSSFramework.hs
index 45fcbb77d..62dbe7421 100644
--- a/IHP/View/CSSFramework.hs
+++ b/IHP/View/CSSFramework.hs
@@ -179,7 +179,7 @@ instance Default CSSFramework where
autofocus={autofocus}
{...additionalAttributes}
>
-
+
{forEach (options fieldType) (getOption)}
diff --git a/NixSupport/default.nix b/NixSupport/default.nix
index 1b1f9b801..ac271d43a 100644
--- a/NixSupport/default.nix
+++ b/NixSupport/default.nix
@@ -7,114 +7,180 @@
, otherDeps ? (p: [])
, projectPath ? ./.
, withHoogle ? false
-, postgresExtensions ? (p: [])
, optimized ? false
-, includeDevTools ? !optimized # Include Postgres?
, rtsFlags ? ""
+, appName ? "app"
, optimizationLevel ? "2"
+, filter
}:
let
allHaskellPackages =
- (if withHoogle
- then ghc.ghcWithHoogle
- else ghc.ghcWithPackages) haskellDeps;
+ (if withHoogle
+ then ghc.ghcWithHoogle
+ else ghc.ghcWithPackages) haskellDeps;
allNativePackages = builtins.concatLists [
- (otherDeps pkgs)
+ (otherDeps pkgs)
];
- appBinary = if optimized
- then "build/bin/RunOptimizedProdServer"
- else "build/bin/RunUnoptimizedProdServer";
-
- jobsBinary = if optimized
- then "build/bin/RunJobsOptimized"
- else "build/bin/RunJobs";
+ ihpLibWithMakefile = filter { root = ihp; include = ["lib/IHP/Makefile.dist"]; name = "ihpLibWithMakefile"; };
+ ihpLibWithMakefileAndStatic = filter { root = ihp; include = ["lib/IHP/Makefile.dist" "lib/IHP/static"]; name = "ihpLibWithMakefileAndStatic"; };
+ splitSections = if !pkgs.stdenv.hostPlatform.isDarwin then "-split-sections" else "";
+
+ schemaObjectFiles =
+ let
+ self = projectPath;
+ in
+ pkgs.stdenv.mkDerivation {
+ name = appName + "-schema";
+ buildPhase = ''
+ mkdir -p build/Generated
+ build-generated-code
+
+ export IHP=${ihpLibWithMakefile}/lib/IHP
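+            # Compile the generated types once here; the binaries derivation below reuses these object files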
+ ghc -O${if optimized then optimizationLevel else "0"} ${splitSections} $(make print-ghc-options) --make build/Generated/Types.hs -odir build/RunProdServer -hidir build/RunProdServer
+
+ cp -r build $out
+ '';
+ src = filter { root = self; include = ["Application/Schema.sql" "Makefile"]; name = "schemaObjectFiles-source"; };
+ nativeBuildInputs =
+ [ (ghc.ghcWithPackages (p: [ p.ihp-ide ])) # Needed for build-generated-code
+ ]
+ ;
+ dontInstall = true;
+ dontFixup = false;
+ disallowedReferences = [ ihp ]; # Prevent including the large full IHP source code
+ };
+
+ prodGhcOptions = "-funbox-strict-fields -fconstraint-solver-iterations=100 -fdicts-strict -with-rtsopts=\"${rtsFlags}\"";
+
+ binaries =
+ pkgs.stdenv.mkDerivation {
+ name = appName + "-binaries";
+ buildPhase = ''
+ mkdir -p build/Generated build/RunProdServer
+ cp -r ${schemaObjectFiles}/RunProdServer build/
+ cp -r ${schemaObjectFiles}/Generated build/
+
+ chmod -R +w build/RunProdServer/*
+
+ export IHP_LIB=${ihpLibWithMakefile}/lib/IHP
+ export IHP=${ihpLibWithMakefile}/lib/IHP
+
+ mkdir -p build/bin build/RunUnoptimizedProdServer
+
+ echo ghc -O${if optimized then optimizationLevel else "0"} ${splitSections} $(make print-ghc-options) ${if optimized then prodGhcOptions else ""} Main.hs -o build/bin/RunProdServer -odir build/RunProdServer -hidir build/RunProdServer
+ ghc -O${if optimized then optimizationLevel else "0"} ${splitSections} $(make print-ghc-options) ${if optimized then prodGhcOptions else ""} Main.hs -o build/bin/RunProdServer -odir build/RunProdServer -hidir build/RunProdServer
+
+ # Build job runner if there are any jobs
+ if find -type d -iwholename \*/Job|grep .; then
+ echo "module RunJobs (main) where" > build/RunJobs.hs
+ echo "import Application.Script.Prelude" >> build/RunJobs.hs
+ echo "import IHP.ScriptSupport" >> build/RunJobs.hs
+ echo "import IHP.Job.Runner" >> build/RunJobs.hs
+ echo "import qualified Config" >> build/RunJobs.hs
+ echo "import Main ()" >> build/RunJobs.hs
+ echo "main :: IO ()" >> build/RunJobs.hs
+ echo "main = runScript Config.config (runJobWorkers (workers RootApplication))" >> build/RunJobs.hs
+ ghc -O${if optimized then optimizationLevel else "0"} ${splitSections} -main-is 'RunJobs.main' $(make print-ghc-options) ${if optimized then prodGhcOptions else ""} build/RunJobs.hs -o build/bin/RunJobs -odir build/RunProdServer -hidir build/RunProdServer
+ fi;
+
+ # Build all scripts if there are any
+ mkdir -p Application/Script
+ SCRIPT_TARGETS=`find Application/Script -type f -iwholename '*.hs' -not -name 'Prelude.hs' -exec basename {} .hs ';' | sed 's#^#build/bin/Script/#' | tr "\n" " "`
+ if [[ ! -z "$SCRIPT_TARGETS" ]]; then
+ # Need to use -j1 here to avoid race conditions of temp files created by GHC.
+ #
+ # These errors look like:
+ #
+ # : error:
+ # build/RunUnoptimizedProdServer/Application/Script/Prelude.o.tmp: renameFile:renamePath:rename: does not exist (No such file or directory)
+ #
+ make -j1 $SCRIPT_TARGETS;
+ fi;
+ '';
+ installPhase = ''
+ mkdir -p "$out"
+ mkdir -p $out/bin $out/lib
+
+ mv build/bin/RunProdServer $out/bin/RunProdServer
+
+ # Copy job runner binary to bin/ if we built it
+ if [ -f build/bin/RunJobs ]; then
+ mv build/bin/RunJobs $out/bin/RunJobs;
+ fi;
+
+ # Copy IHP Script binaries to bin/
+ mkdir -p build/bin/Script
+ find build/bin/Script/ -type f -print0 |
+ while read -d $'\0' script; do
+ script_basename=$(basename "$script")
+ mv "build/bin/Script/$script_basename" "$out/bin/$script_basename";
+ done
+ '';
+ src = filter { root = pkgs.nix-gitignore.gitignoreSource [] projectPath; include = [filter.isDirectory "Makefile" (filter.matchExt "hs")]; exclude = ["static" "Frontend"]; name = "${appName}-source"; };
+ buildInputs = [allHaskellPackages];
+ nativeBuildInputs = [ pkgs.makeWrapper schemaObjectFiles];
+ enableParallelBuilding = true;
+ disallowedReferences = [ ihp ]; # Prevent including the large full IHP source code
+ };
in
pkgs.stdenv.mkDerivation {
- name = "app";
+ name = appName;
buildPhase = ''
- runHook preBuild
-
- mkdir -p build
-
- # When npm install is executed by the project's makefile it will fail with:
- #
- # EACCES: permission denied, mkdir '/homeless-shelter'
- #
- # To avoid this error we use /tmp as our home directory for the build
- #
- # See https://github.com/svanderburg/node2nix/issues/217#issuecomment-751311272
- export HOME=/tmp
-
- export IHP_LIB=${ihp}/lib/IHP
- export IHP=${ihp}/lib/IHP
- export APP_RTS_FLAGS="${rtsFlags}"
- export OPTIMIZATION_LEVEL="${optimizationLevel}"
-
- make -j ${appBinary}
-
- # Build job runner if there are any jobs
- if find -type d -iwholename \*/Job|grep .; then
- make -j ${jobsBinary};
- fi;
-
- # Build all scripts if there are any
- mkdir -p Application/Script
- SCRIPT_TARGETS=`find Application/Script -type f -iwholename '*.hs' -not -name 'Prelude.hs' -exec basename {} .hs ';' | sed 's#^#build/bin/Script/#' | tr "\n" " "`
- if [[ ! -z "$SCRIPT_TARGETS" ]]; then
- # Need to use -j1 here to avoid race conditions of temp files created by GHC.
+ runHook preBuild
+
+ # When npm install is executed by the project's makefile it will fail with:
#
- # These errors look like:
+ # EACCES: permission denied, mkdir '/homeless-shelter'
#
- # : error:
- # build/RunUnoptimizedProdServer/Application/Script/Prelude.o.tmp: renameFile:renamePath:rename: does not exist (No such file or directory)
+ # To avoid this error we use /tmp as our home directory for the build
#
- make -j1 $SCRIPT_TARGETS;
- fi;
+ # See https://github.com/svanderburg/node2nix/issues/217#issuecomment-751311272
+ export HOME=/tmp
+
+ export IHP_LIB=${ihpLibWithMakefileAndStatic}/lib/IHP
+ export IHP=${ihpLibWithMakefileAndStatic}/lib/IHP
- runHook postBuild
+ make -j static/app.css static/app.js
+
+ runHook postBuild
'';
installPhase = ''
- runHook preInstall
-
- mkdir -p "$out"
- mkdir -p $out/bin $out/lib
+ runHook preInstall
- mv ${appBinary} $out/bin/RunProdServerWithoutOptions
+ mkdir -p "$out"
+ mkdir -p $out/bin $out/lib
- INPUT_HASH="$((basename $out) | cut -d - -f 1)"
- makeWrapper $out/bin/RunProdServerWithoutOptions $out/bin/RunProdServer --set-default IHP_ASSET_VERSION $INPUT_HASH --set-default IHP_LIB ${ihp}/lib/IHP --run "cd $out/lib" --prefix PATH : ${pkgs.lib.makeBinPath (otherDeps pkgs)}
+ INPUT_HASH="$((basename $out) | cut -d - -f 1)"
+ makeWrapper ${binaries}/bin/RunProdServer $out/bin/RunProdServer --set-default IHP_ASSET_VERSION $INPUT_HASH --set-default IHP_LIB ${ihpLibWithMakefileAndStatic}/lib/IHP --run "cd $out/lib" --prefix PATH : ${pkgs.lib.makeBinPath (otherDeps pkgs)}
- # Copy job runner binary to bin/ if we built it
- if [ -f ${jobsBinary} ]; then
- mv ${jobsBinary} $out/bin/RunJobsWithoutOptions;
- makeWrapper $out/bin/RunJobsWithoutOptions $out/bin/RunJobs --set-default IHP_ASSET_VERSION $INPUT_HASH --set-default IHP_LIB ${ihp}/lib/IHP --run "cd $out/lib" --prefix PATH : ${pkgs.lib.makeBinPath (otherDeps pkgs)}
- fi;
+ # Copy job runner binary to bin/ if we built it
+ if [ -f ${binaries}/bin/RunJobs ]; then
+ makeWrapper ${binaries}/bin/RunJobs $out/bin/RunJobs --set-default IHP_ASSET_VERSION $INPUT_HASH --set-default IHP_LIB ${ihpLibWithMakefileAndStatic}/lib/IHP --run "cd $out/lib" --prefix PATH : ${pkgs.lib.makeBinPath (otherDeps pkgs)}
+ fi;
- # Copy IHP Script binaries to bin/
- mkdir -p build/bin/Script
- find build/bin/Script/ -type f -print0 |
- while read -d $'\0' script; do
- script_basename=$(basename "$script")
- mv "build/bin/Script/$script_basename" "$out/bin/$script_basename";
- done
+ # Copy other binaries, excluding RunProdServer and RunJobs
+ find ${binaries}/bin/ -type f -not -name 'RunProdServer' -not -name 'RunJobs' -print0 |
+ while read -d $'\0' binary; do
+ binary_basename=$(basename "$binary")
+ cp "$binary" "$out/bin/$binary_basename";
+ done
- mv static "$out/lib/static"
+ mv static "$out/lib/static"
- runHook postInstall
+ runHook postInstall
'';
- dontFixup = true;
src = pkgs.nix-gitignore.gitignoreSource [] projectPath;
- buildInputs = builtins.concatLists [ [allHaskellPackages] allNativePackages ];
+ buildInputs = builtins.concatLists [ allNativePackages ];
nativeBuildInputs = builtins.concatLists [
- [ pkgs.makeWrapper
- pkgs.cacert # Needed for npm install to work from within the IHP build process
- ghc.ihp-ide # Needed for build-generated-code
- ]
- (if includeDevTools then [(pkgs.postgresql_13.withPackages postgresExtensions)] else [])
+ [ pkgs.makeWrapper
+ pkgs.cacert # Needed for npm install to work from within the IHP build process
+      allHaskellPackages
+ ]
];
shellHook = "eval $(egrep ^export ${allHaskellPackages}/bin/ghc)";
enableParallelBuilding = true;
impureEnvVars = pkgs.lib.fetchers.proxyImpureEnvVars; # Needed for npm install to work from within the IHP build process
- }
+ disallowedReferences = [ ihp ]; # Prevent including the large full IHP source code
+ }
\ No newline at end of file
diff --git a/NixSupport/haskell-packages/ihp-hsx.nix b/NixSupport/haskell-packages/ihp-hsx.nix
index 3e1559116..4e3b3d7ef 100644
--- a/NixSupport/haskell-packages/ihp-hsx.nix
+++ b/NixSupport/haskell-packages/ihp-hsx.nix
@@ -12,6 +12,7 @@
, haskell-src-meta
, containers
, unordered-containers
+, hspec
}:
mkDerivation {
pname = "ihp-hsx";
@@ -32,6 +33,7 @@ mkDerivation {
containers
unordered-containers
];
+ testHaskellDepends = [ hspec ];
license = lib.licenses.mit;
enableLibraryForGhci = true;
homepage = "https://ihp.digitallyinduced.com";
diff --git a/ProjectGenerator/bin/ihp-new b/ProjectGenerator/bin/ihp-new
index f85039bb6..106d03588 100755
--- a/ProjectGenerator/bin/ihp-new
+++ b/ProjectGenerator/bin/ihp-new
@@ -133,8 +133,8 @@ if ! [ -x "$(command -v direnv)" ]; then
if [ $DIRENV_HOOK_ADDED = 0 ]; then
echo ""
echo "Please add the hook manually."
- echo -e "Bash: Add \e[4meval \"\$(direnv hook bash)\"\e[0m to ~/.bashrc"
- echo -e "ZSH: Add \e[4meval \"\$(direnv hook zsh)\"\e[0m to ~/.zshrc"
+ echo -e "Bash: Add \e[4meval \"\$(direnv hook bash)\"\e[0m to ~/.bashrc or ~/.bash_profile"
+ echo -e "ZSH: Add \e[4meval \"\$(direnv hook zsh)\"\e[0m to ~/.zshrc or ~/.zprofile"
echo "Other shells: See https://direnv.net/#README"
echo -e "\e[1mHave you hooked direnv into your shell? (Type y to proceed) \e[0m"
while true; do
@@ -160,12 +160,12 @@ fi
DIRENV_SETUP=0;
case "${SHELL##*/}" in
zsh )
- if grep direnv "$HOME/.zshrc" "$HOME/.zprofile" 2>&1 >/dev/null; then
+ if grep -q -s direnv "$HOME/.zshrc" "$HOME/.zprofile"; then
DIRENV_SETUP=1;
fi;
;;
bash )
- if grep direnv "$HOME/.bashrc" "$HOME/.bash_profile" 2>&1 >/dev/null; then
+ if grep -q -s direnv "$HOME/.bashrc" "$HOME/.bash_profile"; then
DIRENV_SETUP=1;
fi;
;;
diff --git a/Test/FileStorage/ControllerFunctionsSpec.hs b/Test/FileStorage/ControllerFunctionsSpec.hs
new file mode 100644
index 000000000..02f6892a0
--- /dev/null
+++ b/Test/FileStorage/ControllerFunctionsSpec.hs
@@ -0,0 +1,70 @@
+module Test.FileStorage.ControllerFunctionsSpec where
+
+import Test.Hspec
+import IHP.Prelude
+import IHP.FileStorage.ControllerFunctions
+import IHP.Controller.Context
+import IHP.FrameworkConfig
+import Network.Wai as Wai (defaultRequest)
+import Network.Wai.Parse (FileInfo(..))
+import IHP.Controller.RequestContext
+import IHP.FileStorage.Types
+import IHP.FileStorage.Config
+
+tests :: Spec
+tests = describe "IHP.FileStorage.ControllerFunctions" $ do
+
+ let config :: ConfigBuilder
+ config = do
+ initStaticDirStorage
+
+ let withFrameworkConfig = IHP.FrameworkConfig.withFrameworkConfig config
+
+ describe "storeFileWithOptions" $ do
+ it "returns the objectPath without the baseUrl" $ do
+ withFrameworkConfig \frameworkConfig -> do
+ context <- createControllerContext frameworkConfig
+ let ?context = context
+
+ let fileInfo = FileInfo
+ { fileName = "test.txt"
+ , fileContentType = "text/plain"
+ , fileContent = "Hello, world!"
+ }
+
+ -- We pass the UUID that will be used as the filename, so we can easily assert the objectPath.
+ let options :: StoreFileOptions = def
+ { fileName = Just "4c55dac2-e411-45ac-aa10-b957b01221df"
+ , directory = "Test.FileStorage.ControllerFunctionsSpec"
+ }
+
+ result <- storeFileWithOptions fileInfo options
+
+ result.url `shouldBe` ("/Test.FileStorage.ControllerFunctionsSpec/4c55dac2-e411-45ac-aa10-b957b01221df")
+
+ describe "createTemporaryDownloadUrlFromPath" $ do
+ it "returns baseUrl concatenated with objectPath when objectPath does not start with http:// or https://" $ do
+ withFrameworkConfig \frameworkConfig -> do
+ context <- createControllerContext frameworkConfig
+ let ?context = context
+ let objectPath = "/static/test.txt"
+ temporaryDownloadUrl <- createTemporaryDownloadUrlFromPath objectPath
+
+ temporaryDownloadUrl.url `shouldBe` "http://localhost:8000/static/test.txt"
+
+ it "returns the objectPath when objectPath starts with 'http://' or 'https://'" $ do
+ withFrameworkConfig \frameworkConfig -> do
+ context <- createControllerContext frameworkConfig
+ let ?context = context
+ let objectPath = "https://example.com/static/test.txt"
+ temporaryDownloadUrl <- createTemporaryDownloadUrlFromPath objectPath
+
+ temporaryDownloadUrl.url `shouldBe` "https://example.com/static/test.txt"
+
+createControllerContext :: FrameworkConfig -> IO ControllerContext
+createControllerContext frameworkConfig = do
+ let
+ requestBody = FormBody { params = [], files = [] }
+ request = Wai.defaultRequest
+ requestContext = RequestContext { request, respond = error "respond", requestBody, frameworkConfig = frameworkConfig }
+ let ?requestContext = requestContext
+ newControllerContext
diff --git a/Test/HSX/ParserSpec.hs b/Test/HSX/ParserSpec.hs
deleted file mode 100644
index eb6849115..000000000
--- a/Test/HSX/ParserSpec.hs
+++ /dev/null
@@ -1,78 +0,0 @@
-{-|
-Module: Test.HSX.QQSpec
-Copyright: (c) digitally induced GmbH, 2020
--}
-module Test.HSX.ParserSpec where
-
-import Test.Hspec
-import IHP.Prelude
-import IHP.HSX.Parser
-import qualified Text.Megaparsec as Megaparsec
-import qualified Text.Megaparsec.Error as Megaparsec
-import qualified "template-haskell" Language.Haskell.TH as TH
-import qualified "template-haskell" Language.Haskell.TH.Syntax as TH
-
-tests = do
-    let position = Megaparsec.SourcePos "" (Megaparsec.mkPos 1) (Megaparsec.mkPos 1)
-    let extensions = []
-    describe "HSX Parser" do
-        it "should fail on invalid html tags" do
-            let errorText = "1:13:\n  |\n1 | <myinvalidel>\n  |             ^\nInvalid tag name: myinvalidel\n"
-            let (Left error) = parseHsx position extensions "<myinvalidel>"
-            (Megaparsec.errorBundlePretty error) `shouldBe` errorText
-
-        it "should fail on invalid attribute names" do
-            let errorText = "1:23:\n  |\n1 | <div invalid-attribute=\"test\"/>\n  |                       ^\nInvalid attribute name: invalid-attribute\n"
-            let (Left error) = parseHsx position extensions "<div invalid-attribute=\"test\"/>"
-            (Megaparsec.errorBundlePretty error) `shouldBe` errorText
-
-        it "should fail on unmatched tags" do
-            let errorText = "1:7:\n  |\n1 | <div></span>\n  |       ^\nunexpected '/'\nexpecting \"</div>\", identifier, or white space\n"
-            let (Left error) = parseHsx position extensions "<div></span>"
-            (Megaparsec.errorBundlePretty error) `shouldBe` errorText
-
-        it "should parse a closing tag with spaces" do
-            let p = parseHsx position extensions "<div></div >"
-            p `shouldBe` (Right (Children [Node "div" [] [] False]))
-
-        it "should strip spaces around nodes" do
-            let p = parseHsx position extensions "<div> <span> </span> </div>"
-            p `shouldBe` (Right (Children [Node "div" [] [Node "span" [] [] False] False]))
-
-        it "should strip spaces after self closing tags" do
-            let p = parseHsx position extensions "<head>{\"meta\"}\n\n    <link rel=\"stylesheet\" href=\"/vendor/bootstrap.min.css\"/>\n</head>"
-            p `shouldBe` (Right (Children [Node "head" [] [SplicedNode (TH.LitE (TH.StringL "meta")),Node "link" [StaticAttribute "rel" (TextValue "stylesheet"),StaticAttribute "href" (TextValue "/vendor/bootstrap.min.css")] [] True] False]))
-
-        it "should not strip spaces in a text node" do
-            let p = parseHsx position extensions " Hello World "
-            p `shouldBe` (Right (Children [TextNode "Hello World"]))
-
-        it "should deal with variables in text nodes" do
-            let p = parseHsx position extensions "<div>\n    Hello {\"name\"}! \n</div>"
-            p `shouldBe` (Right (Children [Node "div" [] [TextNode "Hello ",SplicedNode (TH.LitE (TH.StringL "name")),TextNode "!"] False]))
-
-        it "should parse self closing tags with spaces around it" do
-            let p = parseHsx position extensions " <div/> "
-            p `shouldBe` (Right (Children [Node "div" [] [] False]))
-
-        it "should collapse spaces" do
-            let p = parseHsx position extensions "\n    Hello\n    World\n    !    "
-            p `shouldBe` (Right (Children [TextNode "Hello World !"]))
-
-        it "should parse spread values" do
-            let p = parseHsx position extensions "<div {...variables}/>"
-            -- We cannot easily construct the @VarE variables@ expression, therefore we use show here for comparison
-            tshow p `shouldBe` "Right (Children [Node \"div\" [SpreadAttributes (VarE variables)] [] False])"
-
-        it "should parse spread values with a space" do
-            -- See https://github.com/digitallyinduced/ihp/issues/1588
-            let p = parseHsx position extensions "<div { ...variables }/>"
-            tshow p `shouldBe` "Right (Children [Node \"div\" [SpreadAttributes (VarE variables)] [] False])"
-
-        it "should accept underscores in data attributes" do
-            let p = parseHsx position extensions "<div data-client_id=\"test\"/>"
-            p `shouldBe` (Right (Children [Node "div" [StaticAttribute "data-client_id" (TextValue "test")] [] False]))
-
-        it "should accept doctype" do
-            let p = parseHsx position extensions "<!DOCTYPE html><html lang=\"en\"><body>hello</body></html>"
-            p `shouldBe` (Right (Children [Node "!DOCTYPE" [StaticAttribute "html" (TextValue "html")] [] True, Node "html" [StaticAttribute "lang" (TextValue "en")] [Node "body" [] [TextNode "hello"] False] False]))
\ No newline at end of file
diff --git a/Test/IDE/CodeGeneration/MigrationGenerator.hs b/Test/IDE/CodeGeneration/MigrationGenerator.hs
index abe4ad431..b4b87cfa0 100644
--- a/Test/IDE/CodeGeneration/MigrationGenerator.hs
+++ b/Test/IDE/CodeGeneration/MigrationGenerator.hs
@@ -1362,6 +1362,17 @@ CREATE POLICY "Users can read and edit their own record" ON public.users USING (
diffSchemas targetSchema actualSchema `shouldBe` migration
+ it "should ignore ar_did_update_.. triggers by IHP.AutoRefresh" do
+ let actualSchema = sql $ cs [plain|
+ CREATE TRIGGER ar_did_update_plans AFTER UPDATE ON public.plans FOR EACH ROW EXECUTE FUNCTION public.notify_did_change_plans();
+ CREATE TRIGGER ar_did_insert_offices AFTER INSERT ON public.offices FOR EACH STATEMENT EXECUTE FUNCTION public.notify_did_change_offices();
+ CREATE TRIGGER ar_did_delete_company_profiles AFTER DELETE ON public.company_profiles FOR EACH STATEMENT EXECUTE FUNCTION public.notify_did_change_company_profiles();
+ |]
+ let targetSchema = []
+ let migration = []
+
+ diffSchemas targetSchema actualSchema `shouldBe` migration
+
it "should deal with truncated identifiers" do
let actualSchema = sql $ cs [plain|
CREATE POLICY "Users can manage the prepare_context_jobs if they can see the C" ON public.prepare_context_jobs USING ((EXISTS ( SELECT 1
diff --git a/Test/IDE/CodeGeneration/ViewGenerator.hs b/Test/IDE/CodeGeneration/ViewGenerator.hs
index c229172f9..4c96bbdfd 100644
--- a/Test/IDE/CodeGeneration/ViewGenerator.hs
+++ b/Test/IDE/CodeGeneration/ViewGenerator.hs
@@ -35,7 +35,8 @@ tests = do
}
]
it "should build a view with name \"EditView\"" do
- let viewName = "EditView"
+ let rawViewName = "EditView"
+ let viewName = tableNameToViewName rawViewName
let rawControllerName = "Pages"
let controllerName = tableNameToControllerName rawControllerName
let modelName = tableNameToModelName rawControllerName
@@ -50,8 +51,41 @@ tests = do
+ it "should build a view with name \"edit_view\"" do
+ let rawViewName = "edit_view"
+ let viewName = tableNameToViewName rawViewName
+ let rawControllerName = "Pages"
+ let controllerName = tableNameToControllerName rawControllerName
+ let modelName = tableNameToModelName rawControllerName
+ let applicationName = "Web"
+ let paginationEnabled = False
+ let config = ViewGenerator.ViewConfig { .. }
+ let builtPlan = ViewGenerator.buildPlan' schema config
+
+ builtPlan `shouldBe`
+ [ EnsureDirectory {directory = "Web/View/Pages"},CreateFile {filePath = "Web/View/Pages/Edit.hs", fileContent = "module Web.View.Pages.Edit where\nimport Web.View.Prelude\n\ndata EditView = EditView { page :: Page }\n\ninstance View EditView where\n html EditView { .. } = [hsx|\n {breadcrumb}\n <h1>Edit Page</h1>\n {renderForm page}\n |]\n where\n breadcrumb = renderBreadcrumb\n [ breadcrumbLink \"Pages\" PagesAction\n , breadcrumbText \"Edit Page\"\n ]\n\nrenderForm :: Page -> Html\nrenderForm page = formFor page [hsx|\n \n {submitButton}\n\n|]"},AddImport {filePath = "Web/Controller/Pages.hs", fileContent = "import Web.View.Pages.Edit"}
+ ]
+
+
+ it "should build a view with name \"editView\"" do
+ let rawViewName = "editView"
+ let viewName = tableNameToViewName rawViewName
+ let rawControllerName = "Pages"
+ let controllerName = tableNameToControllerName rawControllerName
+ let modelName = tableNameToModelName rawControllerName
+ let applicationName = "Web"
+ let paginationEnabled = False
+ let config = ViewGenerator.ViewConfig { .. }
+ let builtPlan = ViewGenerator.buildPlan' schema config
+
+ builtPlan `shouldBe`
+ [ EnsureDirectory {directory = "Web/View/Pages"},CreateFile {filePath = "Web/View/Pages/Edit.hs", fileContent = "module Web.View.Pages.Edit where\nimport Web.View.Prelude\n\ndata EditView = EditView { page :: Page }\n\ninstance View EditView where\n html EditView { .. } = [hsx|\n {breadcrumb}\n <h1>Edit Page</h1>\n {renderForm page}\n |]\n where\n breadcrumb = renderBreadcrumb\n [ breadcrumbLink \"Pages\" PagesAction\n , breadcrumbText \"Edit Page\"\n ]\n\nrenderForm :: Page -> Html\nrenderForm page = formFor page [hsx|\n \n {submitButton}\n\n|]"},AddImport {filePath = "Web/Controller/Pages.hs", fileContent = "import Web.View.Pages.Edit"}
+ ]
+
+
it "should build a view with name \"Edit\"" do
- let viewName = "Edit"
+ let rawViewName = "Edit"
+ let viewName = tableNameToViewName rawViewName
let rawControllerName = "Pages"
let controllerName = tableNameToControllerName rawControllerName
let modelName = tableNameToModelName rawControllerName
@@ -66,7 +100,8 @@ tests = do
it "should build a view with name \"Test\"" do
- let viewName = "Test"
+ let rawViewName = "Test"
+ let viewName = tableNameToViewName rawViewName
let rawControllerName = "Pages"
let controllerName = tableNameToControllerName rawControllerName
let modelName = tableNameToModelName rawControllerName
@@ -76,5 +111,5 @@ tests = do
let builtPlan = ViewGenerator.buildPlan' schema config
builtPlan `shouldBe`
- [ EnsureDirectory {directory = "Web/View/Pages"},CreateFile {filePath = "Web/View/Pages/Test.hs", fileContent = "module Web.View.Pages.Test where\nimport Web.View.Prelude\ndata TestView = {TestView}\n\ninstance View TestView where\n html TestView { .. } = [hsx|\n {breadcrumb}\n
- * @param {QueryBuilder<table, result>} queryBuilder A database query
- */
- function useQuery<table extends TableName, result>(queryBuilder: QueryBuilder<table, result>): Array<result>;
-}
|]
where
tableNameTypeDef :: Text
diff --git a/ihp-datasync-typescript/Test/Spec.hs b/ihp-datasync-typescript/Test/Spec.hs
index d08b06e52..e9061dcb2 100644
--- a/ihp-datasync-typescript/Test/Spec.hs
+++ b/ihp-datasync-typescript/Test/Spec.hs
@@ -383,28 +383,6 @@ tests = do
}
function ThinBackend(props: ThinBackendProps): JSX.Element;
}
-
- declare module 'ihp-datasync/react18' {
- import { TableName, QueryBuilder } from 'ihp-datasync';
-
- /**
- * React hook for querying the database and streaming results in real-time.
- *
- * Suspends while the data is being fetched from the server.
- *
- * @example
- * function TasksList() {
- * const tasks = useQuery(query('tasks').orderBy('createdAt'))
- *
- * return <div>
- * {tasks.map(task => <div key={task.id}>{task.title}</div>)}
- * </div>
- * }
- *
- * @param {QueryBuilder<table, result>} queryBuilder A database query
- */
- function useQuery<table extends TableName, result>(queryBuilder: QueryBuilder<table, result>): Array<result>;
- }
|]
(generateTypeScriptTypeDefinitions schema) `shouldBe` expected
diff --git a/ihp-graphql/IHP/GraphQL/SchemaCompiler.hs b/ihp-graphql/IHP/GraphQL/SchemaCompiler.hs
index dd0b48692..e2035ce20 100644
--- a/ihp-graphql/IHP/GraphQL/SchemaCompiler.hs
+++ b/ihp-graphql/IHP/GraphQL/SchemaCompiler.hs
@@ -213,4 +213,5 @@ postgresTypeToGraphQLType PInet = NamedType "IPv4"
postgresTypeToGraphQLType PTSVector = NamedType "String"
postgresTypeToGraphQLType (PArray type_) = ListType (postgresTypeToGraphQLType type_)
postgresTypeToGraphQLType PTrigger = error "Trigger cannot be converted to a GraphQL type"
+postgresTypeToGraphQLType PEventTrigger = error "Event trigger cannot be converted to a GraphQL type"
postgresTypeToGraphQLType (PCustomType theType) = NamedType "String"
\ No newline at end of file
diff --git a/ihp-hsx/IHP/HSX/Parser.hs b/ihp-hsx/IHP/HSX/Parser.hs
index 1f50d1d5c..2c1eac21e 100644
--- a/ihp-hsx/IHP/HSX/Parser.hs
+++ b/ihp-hsx/IHP/HSX/Parser.hs
@@ -15,6 +15,7 @@ module IHP.HSX.Parser
, Attribute (..)
, AttributeValue (..)
, collapseSpace
+, HsxSettings (..)
) where
import Prelude
@@ -34,6 +35,12 @@ import qualified Data.Set as Set
import qualified Data.Containers.ListUtils as List
import qualified IHP.HSX.HaskellParser as HaskellParser
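+-- | Settings that control how strictly the HSX parser validates markup.
+--
+-- When 'checkMarkup' is False, any tag and attribute names are accepted.
+-- When it's True, names must be in the built-in whitelists or in
+-- 'additionalTagNames' / 'additionalAttributeNames'.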
+data HsxSettings = HsxSettings
+ { checkMarkup :: Bool
+ , additionalTagNames :: Set Text
+ , additionalAttributeNames :: Set Text
+ }
+
data AttributeValue = TextValue !Text | ExpressionValue !Haskell.Exp deriving (Eq, Show)
data Attribute = StaticAttribute !Text !AttributeValue | SpreadAttributes !Haskell.Exp deriving (Eq, Show)
@@ -57,15 +64,16 @@ data Node = Node !Text ![Attribute] ![Node] !Bool
-- > let position = Megaparsec.SourcePos filePath (Megaparsec.mkPos line) (Megaparsec.mkPos col)
-- > let hsxText = "Hello"
-- >
--- > let (Right node) = parseHsx position [] hsxText
-parseHsx :: SourcePos -> [TH.Extension] -> Text -> Either (ParseErrorBundle Text Void) Node
-parseHsx position extensions code =
+-- > let (Right node) = parseHsx settings position [] hsxText
+parseHsx :: HsxSettings -> SourcePos -> [TH.Extension] -> Text -> Either (ParseErrorBundle Text Void) Node
+parseHsx settings position extensions code =
let
?extensions = extensions
+ ?settings = settings
in
runParser (setPosition position *> parser) "" code
-type Parser a = (?extensions :: [TH.Extension]) => Parsec Void Text a
+type Parser a = (?extensions :: [TH.Extension], ?settings :: HsxSettings) => Parsec Void Text a
setPosition pstateSourcePos = updateParserState (\state -> state {
statePosState = (statePosState state) { pstateSourcePos }
@@ -211,15 +219,16 @@ hsxNodeAttribute = do
hsxAttributeName :: Parser Text
hsxAttributeName = do
name <- rawAttribute
- unless (isValidAttributeName name) (fail $ "Invalid attribute name: " <> cs name)
+ let checkingMarkup = ?settings.checkMarkup
+ unless (isValidAttributeName name || not checkingMarkup) (fail $ "Invalid attribute name: " <> cs name)
pure name
where
isValidAttributeName name =
"data-" `Text.isPrefixOf` name
|| "aria-" `Text.isPrefixOf` name
|| "hx-" `Text.isPrefixOf` name
- || "hx-" `Text.isPrefixOf` name
|| name `Set.member` attributes
+ || name `Set.member` ?settings.additionalAttributeNames
rawAttribute = takeWhile1P Nothing (\c -> Char.isAlphaNum c || c == '-' || c == '_')
@@ -285,13 +294,18 @@ hsxSplicedNode = do
treeToString acc (TokenNode (x:xs)) = ((treeToString (acc <> "{") x) <> (Text.concat $ fmap (treeToString "") xs)) <> "}"
+
hsxElementName :: Parser Text
hsxElementName = do
name <- takeWhile1P (Just "identifier") (\c -> Char.isAlphaNum c || c == '_' || c == '-' || c == '!')
let isValidParent = name `Set.member` parents
let isValidLeaf = name `Set.member` leafs
- let isValidCustomWebComponent = "-" `Text.isInfixOf` name
- unless (isValidParent || isValidLeaf || isValidCustomWebComponent) (fail $ "Invalid tag name: " <> cs name)
+ let isValidCustomWebComponent = "-" `Text.isInfixOf` name
+ && not (Text.isPrefixOf "-" name)
+ && not (Char.isNumber (Text.head name))
+ let isValidAdditionalTag = name `Set.member` ?settings.additionalTagNames
+ let checkingMarkup = ?settings.checkMarkup
+ unless (isValidParent || isValidLeaf || isValidCustomWebComponent || isValidAdditionalTag || not checkingMarkup) (fail $ "Invalid tag name: " <> cs name)
space
pure name
diff --git a/ihp-hsx/IHP/HSX/QQ.hs b/ihp-hsx/IHP/HSX/QQ.hs
index a064f2fb4..4599c7ff2 100644
--- a/ihp-hsx/IHP/HSX/QQ.hs
+++ b/ihp-hsx/IHP/HSX/QQ.hs
@@ -5,7 +5,7 @@ Module: IHP.HSX.QQ
Description: Defines the @[hsx||]@ syntax
Copyright: (c) digitally induced GmbH, 2022
-}
-module IHP.HSX.QQ (hsx) where
+module IHP.HSX.QQ (hsx, uncheckedHsx, customHsx) where
import Prelude
import Data.Text (Text)
@@ -27,20 +27,40 @@ import Data.List (foldl')
import IHP.HSX.Attribute
import qualified Text.Blaze.Html5.Attributes as Attributes
import qualified Data.HashMap.Strict as HashMap
+import qualified Data.Set as Set
hsx :: QuasiQuoter
-hsx = QuasiQuoter {
- quoteExp = quoteHsxExpression,
- quotePat = error "quotePat: not defined",
- quoteDec = error "quoteDec: not defined",
- quoteType = error "quoteType: not defined"
- }
+hsx = customHsx
+ (HsxSettings
+ { checkMarkup = True
+ , additionalTagNames = Set.empty
+ , additionalAttributeNames = Set.empty
+ }
+ )
-quoteHsxExpression :: String -> TH.ExpQ
-quoteHsxExpression code = do
+uncheckedHsx :: QuasiQuoter
+uncheckedHsx = customHsx
+ (HsxSettings
+ { checkMarkup = False
+ , additionalTagNames = Set.empty
+ , additionalAttributeNames = Set.empty
+ }
+ )
+
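+-- | Builds a QuasiQuoter from the given 'HsxSettings'. Both 'hsx' and
+-- 'uncheckedHsx' above are defined in terms of this.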
+customHsx :: HsxSettings -> QuasiQuoter
+customHsx settings =
+ QuasiQuoter
+ { quoteExp = quoteHsxExpression settings
+ , quotePat = error "quotePat: not defined"
+ , quoteDec = error "quoteDec: not defined"
+ , quoteType = error "quoteType: not defined"
+ }
+
+quoteHsxExpression :: HsxSettings -> String -> TH.ExpQ
+quoteHsxExpression settings code = do
hsxPosition <- findHSXPosition
extensions <- TH.extsEnabled
- expression <- case parseHsx hsxPosition extensions (cs code) of
+ expression <- case parseHsx settings hsxPosition extensions (cs code) of
Left error -> fail (Megaparsec.errorBundlePretty error)
Right result -> pure result
compileToHaskell expression
@@ -458,4 +478,4 @@ textToStaticString text = StaticString (Text.unpack text ++) (Text.encodeUtf8 te
{-# INLINE textToStaticString #-}
instance Show (MarkupM ()) where
- show html = BlazeString.renderHtml html
+ show html = BlazeString.renderHtml html
\ No newline at end of file
diff --git a/ihp-hsx/README.md b/ihp-hsx/README.md
index 20fcc3a4f..d576bbbc2 100644
--- a/ihp-hsx/README.md
+++ b/ihp-hsx/README.md
@@ -274,6 +274,102 @@ The underlying HTML library blaze currently does not support an empty HTML attri
If you use HTML entities, such as ` ` for a non-breaking space, you will notice they appear exactly like that. To output directly (i.e. unescaped) use the method `preEscapedToMarkup` from `Text.Blaze.Html5`.
+### Custom HSX and Unchecked HSX
+
+HSX provides two additional QuasiQuoters beyond the standard `[hsx|...|]` for increased flexibility: `uncheckedHsx` and `customHsx`.
+
+#### Using `uncheckedHsx`
+
+`uncheckedHsx` provides a quick way to bypass HSX's strict tag and attribute name checking.
+
+It will still check for a valid HTML structure, but it will accept any tag and attribute names.
+
+
+```haskell
+[uncheckedHsx|
+<my-custom-element>
+ <nonexistent-tag custom-attribute="value">Content</nonexistent-tag>
+</my-custom-element>
+|]
+```
+
+While convenient for rapid development, use it with caution as you lose the benefits of compile-time guarantees for your markup.
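+
+For example, the structural check still catches mismatched tags (a sketch; the tag and attribute names are accepted, but the nesting must be valid):
+
+```haskell
+-- This fails to compile even with uncheckedHsx,
+-- because the closing tag doesn't match:
+[uncheckedHsx|<div></span>|]
+```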
+
+#### Using `customHsx`
+
+`customHsx` allows you to extend the default HSX with additional whitelisted tag names and attribute names while maintaining the same strict compile-time checking of the default `hsx`.
+
+This makes it easier to use custom elements that often also contain special attributes, and javascript libraries, for example `_hyperscript`, that use the `_` as an attribute name.
+
+
+To use `customHsx`, you need to define it in a separate module because of Template Haskell's stage restriction. Here's how to set it up:
+
+1. First, create a new module for your custom HSX (e.g., `Application.Helper.CustomHsx`):
+
+```haskell
+module Application.Helper.CustomHsx where
+
+import IHP.Prelude
+import IHP.HSX.QQ (customHsx)
+import IHP.HSX.Parser
+import Language.Haskell.TH.Quote
+import qualified Data.Set as Set
+
+myHsx :: QuasiQuoter
+myHsx = customHsx
+ (HsxSettings
+ { checkMarkup = True
+ , additionalTagNames = Set.fromList ["book", "heading", "name"]
+ , additionalAttributeNames = Set.fromList ["_", "custom-attribute"]
+ }
+ )
+```
+
+Configuration options for `HsxSettings`:
+- `checkMarkup`: Boolean to enable/disable markup checking
+- `additionalTagNames`: Set of additional allowed tag names
+- `additionalAttributeNames`: Set of additional allowed attribute names
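+
+For reference, the built-in quasiquoters are presets of these settings: `hsx` uses `HsxSettings True Set.empty Set.empty` and `uncheckedHsx` uses `HsxSettings False Set.empty Set.empty`.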
+
+2. Make it available in your views by adding it to your view helpers module:
+
+```haskell
+module Application.Helper.View (
+ module Application.Helper.View,
+ module Application.Helper.CustomHsx -- Add this line
+) where
+
+import IHP.ViewPrelude
+import Application.Helper.CustomHsx (myHsx) -- Add this line
+```
+
+3. Use it in your views:
+
+```haskell
+[myHsx|
+<book>
+ <heading>My Book</heading>
+ <name>Author Name</name>
+</book>
+|]
+```
+
+The custom HSX will validate that tags and attributes are either in the default HSX whitelist or in your additional sets. This gives you the flexibility to use custom elements and attributes.
+
+This approach is particularly useful for:
+- Web Components with custom attribute names
+- UI libraries with non-standard attributes
+- Domain-specific XML markup languages
+- Integration with third-party frameworks that extend HTML syntax
+
+`customHsx` whitelisting, and even `uncheckedHsx`, does not entirely help with libraries that use very unusual symbols in their attributes, like Alpine.js, because HSX does not recognize HTML attributes starting with `@` or containing `:`. In these cases, the spread syntax `{...attributeList}` is likely your best bet.
+
+```haskell
+-- This will not work
+[uncheckedHsx|<button @click="open = true">Open</button>|]
+
+-- Using spread syntax will work
+[hsx|<button {...[("@click" :: Text, "open = true" :: Text)]}>Open</button>|]
+```
## Common HSX Patterns
diff --git a/ihp-hsx/Test/IHP/HSX/CustomHsxCases.hs b/ihp-hsx/Test/IHP/HSX/CustomHsxCases.hs
new file mode 100644
index 000000000..af39fcaf2
--- /dev/null
+++ b/ihp-hsx/Test/IHP/HSX/CustomHsxCases.hs
@@ -0,0 +1,38 @@
+{-|
+Module: IHP.HSX.CustomHsxCases
+Description: Test helpers for HSX tests with custom tags and attributes
+-}
+module IHP.HSX.CustomHsxCases where
+
+import Test.Hspec
+import Prelude
+import IHP.HSX.QQ
+import qualified Text.Blaze.Renderer.Text as Blaze
+import Data.Text
+import Language.Haskell.TH.Quote
+import IHP.HSX.Parser
+import qualified Data.Set as Set
+
+myCustomHsx :: QuasiQuoter
+myCustomHsx = customHsx
+ (HsxSettings { checkMarkup = True
+ , additionalTagNames = Set.fromList ["mycustomtag", "anothercustomtag"]
+ , additionalAttributeNames = Set.fromList ["my-custom-attr", "anothercustomattr"]
+ }
+ )
+
+myTagsOnlyHsx :: QuasiQuoter
+myTagsOnlyHsx = customHsx
+ (HsxSettings { checkMarkup = True
+ , additionalTagNames = Set.fromList ["mycustomtag", "anothercustomtag"]
+ , additionalAttributeNames = Set.fromList []
+ }
+ )
+
+myAttrsOnlyHsx :: QuasiQuoter
+myAttrsOnlyHsx = customHsx
+ (HsxSettings { checkMarkup = True
+ , additionalTagNames = Set.fromList []
+ , additionalAttributeNames = Set.fromList ["my-custom-attr", "anothercustomattr"]
+ }
+ )
diff --git a/ihp-hsx/Test/IHP/HSX/ParserSpec.hs b/ihp-hsx/Test/IHP/HSX/ParserSpec.hs
new file mode 100644
index 000000000..8087828a5
--- /dev/null
+++ b/ihp-hsx/Test/IHP/HSX/ParserSpec.hs
@@ -0,0 +1,172 @@
+{-|
+Module: IHP.HSX.ParserSpec
+Copyright: (c) digitally induced GmbH, 2020
+-}
+module IHP.HSX.ParserSpec where
+
+import Test.Hspec
+import Prelude
+import IHP.HSX.Parser
+import qualified Text.Megaparsec as Megaparsec
+import qualified Text.Megaparsec.Error as Megaparsec
+import qualified "template-haskell" Language.Haskell.TH as TH
+import qualified "template-haskell" Language.Haskell.TH.Syntax as TH
+import qualified Data.Set as Set
+
+
+tests = do
+ let position = Megaparsec.SourcePos "" (Megaparsec.mkPos 1) (Megaparsec.mkPos 1)
+ let extensions = []
+
+ describe "HSX Parser" do
+ let settings = HsxSettings True Set.empty Set.empty
+ it "should fail on invalid html tags" do
+ let errorText = "1:13:\n |\n1 | \n | ^\nInvalid tag name: myinvalidel\n"
+ let (Left error) = parseHsx settings position extensions ""
+ (Megaparsec.errorBundlePretty error) `shouldBe` errorText
+
+ it "should fail on invalid attribute names" do
+ let errorText = "1:23:\n |\n1 |
\n | ^\nInvalid attribute name: invalid-attribute\n"
+ let (Left error) = parseHsx settings position extensions "
"
+ (Megaparsec.errorBundlePretty error) `shouldBe` errorText
+
+ it "should fail on unmatched tags" do
+ let errorText = "1:7:\n |\n1 |
\n | ^\nunexpected '/'\nexpecting \"
\", identifier, or white space\n"
+ let (Left error) = parseHsx settings position extensions "
"
+ (Megaparsec.errorBundlePretty error) `shouldBe` errorText
+
+ it "should parse a closing tag with spaces" do
+ let p = parseHsx settings position extensions ""
+ p `shouldBe` (Right (Children [Node "div" [] [] False]))
+
+ it "should strip spaces around nodes" do
+ let p = parseHsx settings position extensions "
"
+ p `shouldBe` (Right (Children [Node "div" [] [Node "span" [] [] False] False]))
+
+ it "should strip spaces after self closing tags" do
+ let p = parseHsx settings position extensions "{\"meta\"}\n\n "
+ p `shouldBe` (Right (Children [Node "head" [] [SplicedNode (TH.LitE (TH.StringL "meta")),Node "link" [StaticAttribute "rel" (TextValue "stylesheet"),StaticAttribute "href" (TextValue "/vendor/bootstrap.min.css")] [] True] False]))
+
+ it "should not strip spaces in a text node" do
+ let p = parseHsx settings position extensions " Hello World "
+ p `shouldBe` (Right (Children [TextNode "Hello World"]))
+
+ it "should deal with variables in text nodes" do
+ let p = parseHsx settings position extensions "
\n Hello {\"name\"}! \n
"
+ p `shouldBe` (Right (Children [Node "div" [] [TextNode "Hello ",SplicedNode (TH.LitE (TH.StringL "name")),TextNode "!"] False]))
+
+ it "should parse self closing tags with spaces around it" do
+ let p = parseHsx settings position extensions " "
+ p `shouldBe` (Right (Children [Node "div" [] [] False]))
+
+ it "should collapse spaces" do
+ let p = parseHsx settings position extensions "\n Hello\n World\n ! "
+ p `shouldBe` (Right (Children [TextNode "Hello World !"]))
+
+ it "should parse spread values" do
+ let p = parseHsx settings position extensions ""
+ -- We cannot easily construct the @VarE variables@ expression, therefore we use show here for comparison
+ show p `shouldBe` "Right (Children [Node \"div\" [SpreadAttributes (VarE variables)] [] False])"
+
+ it "should parse spread values with a space" do
+ -- See https://github.com/digitallyinduced/ihp/issues/1588
+ let p = parseHsx settings position extensions ""
+ show p `shouldBe` "Right (Children [Node \"div\" [SpreadAttributes (VarE variables)] [] False])"
+
+ it "should accept underscores in data attributes" do
+ let p = parseHsx settings position extensions "
"
+ p `shouldBe` (Right (Children [Node "div" [StaticAttribute "data-client_id" (TextValue "test")] [] False]))
+
+ it "should accept doctype" do
+ let p = parseHsx settings position extensions "hello"
+ p `shouldBe` (Right (Children [Node "!DOCTYPE" [StaticAttribute "html" (TextValue "html")] [] True, Node "html" [StaticAttribute "lang" (TextValue "en")] [Node "body" [] [TextNode "hello"] False] False]))
+
+ describe "uncheckedHsx" do
+ let settings = HsxSettings False Set.empty Set.empty
+ it "should not check markup" do
+ let p = parseHsx settings position extensions ""
+ p `shouldBe` (Right (Children [Node "invalid-tag" [StaticAttribute "invalid-attribute" (TextValue "invalid")] [] False]))
+
+ it "should not check attribute names" do
+ let p = parseHsx settings position extensions "
"
+ p `shouldBe` (Right (Children [Node "div" [StaticAttribute "invalid-attribute" (TextValue "invalid")] [] False]))
+
+ it "should fail on unmatched tags" do
+ let errorText = "1:7:\n |\n1 |
\n | ^\nunexpected '/'\nexpecting \"
\", identifier, or white space\n"
+ let (Left error) = parseHsx settings position extensions "
"
+ (Megaparsec.errorBundlePretty error) `shouldBe` errorText
+
+ it "should parse a closing tag with spaces" do
+ let p = parseHsx settings position extensions ""
+ p `shouldBe` (Right (Children [Node "div" [] [] False]))
+
+ it "should strip spaces around nodes" do
+ let p = parseHsx settings position extensions "
"
+ p `shouldBe` (Right (Children [Node "div" [] [Node "span" [] [] False] False]))
+
+ it "should strip spaces after self closing tags" do
+ let p = parseHsx settings position extensions "{\"meta\"}\n\n "
+ p `shouldBe` (Right (Children [Node "head" [] [SplicedNode (TH.LitE (TH.StringL "meta")),Node "link" [StaticAttribute "rel" (TextValue "stylesheet"),StaticAttribute "href" (TextValue "/vendor/bootstrap.min.css")] [] True] False]))
+
+ it "should not strip spaces in a text node" do
+ let p = parseHsx settings position extensions " Hello World "
+ p `shouldBe` (Right (Children [TextNode "Hello World"]))
+
+ it "should deal with variables in text nodes" do
+ let p = parseHsx settings position extensions "
\n Hello {\"name\"}! \n
"
+ p `shouldBe` (Right (Children [Node "div" [] [TextNode "Hello ",SplicedNode (TH.LitE (TH.StringL "name")),TextNode "!"] False]))
+
+ it "should parse self closing tags with spaces around it" do
+ let p = parseHsx settings position extensions " "
+ p `shouldBe` (Right (Children [Node "div" [] [] False]))
+
+ it "should collapse spaces" do
+ let p = parseHsx settings position extensions "\n Hello\n World\n ! "
+ p `shouldBe` (Right (Children [TextNode "Hello World !"]))
+
+ it "should parse spread values" do
+ let p = parseHsx settings position extensions ""
+ -- We cannot easily construct the @VarE variables@ expression, therefore we use show here for comparison
+ show p `shouldBe` "Right (Children [Node \"div\" [SpreadAttributes (VarE variables)] [] False])"
+
+ it "should parse spread values with a space" do
+ -- See https://github.com/digitallyinduced/ihp/issues/1588
+ let p = parseHsx settings position extensions ""
+ show p `shouldBe` "Right (Children [Node \"div\" [SpreadAttributes (VarE variables)] [] False])"
+
+ it "should accept underscores in data attributes" do
+ let p = parseHsx settings position extensions "
"
+ p `shouldBe` (Right (Children [Node "div" [StaticAttribute "data-client_id" (TextValue "test")] [] False]))
+
+ it "should accept doctype" do
+ let p = parseHsx settings position extensions "hello"
+ p `shouldBe` (Right (Children [Node "!DOCTYPE" [StaticAttribute "html" (TextValue "html")] [] True, Node "html" [StaticAttribute "lang" (TextValue "en")] [Node "body" [] [TextNode "hello"] False] False]))
+
+ describe "customHsx" do
+ let customSettings = HsxSettings True
+ (Set.fromList ["mycustomtag"])
+ (Set.fromList ["my-custom-attr"])
+
+ it "should allow specified custom tags" do
+ let p = parseHsx customSettings position extensions "hello"
+ p `shouldBe` (Right (Children [Node "mycustomtag" [] [TextNode "hello"] False]))
+
+ it "should reject non-specified custom tags" do
+ let errorText = "1:15:\n |\n1 | hello\n | ^\nInvalid tag name: notallowedtag\n"
+ case parseHsx customSettings position extensions "hello" of
+ Left error -> (Megaparsec.errorBundlePretty error) `shouldBe` errorText
+ Right _ -> fail "Expected parser to fail with invalid tag name"
+
+ it "should allow specified custom attributes" do
+ let p = parseHsx customSettings position extensions "
test
"
+ p `shouldBe` (Right (Children [Node "div" [StaticAttribute "my-custom-attr" (TextValue "hello")] [TextNode "test"] False]))
+
+ it "should reject non-specified custom attributes" do
+ let errorText = "1:22:\n |\n1 |
\n | ^\nInvalid attribute name: not-allowed-attr\n"
+ case parseHsx customSettings position extensions "
" of
+ Left error -> (Megaparsec.errorBundlePretty error) `shouldBe` errorText
+ Right _ -> fail "Expected parser to fail with invalid attribute name"
+
+ it "should allow mixing custom and standard elements" do
+ let p = parseHsx customSettings position extensions "test"
+ p `shouldBe` (Right (Children [Node "mycustomtag" [StaticAttribute "class" (TextValue "hello"), StaticAttribute "my-custom-attr" (TextValue "world")] [TextNode "test"] False]))
\ No newline at end of file
diff --git a/Test/HSX/QQSpec.hs b/ihp-hsx/Test/IHP/HSX/QQSpec.hs
similarity index 88%
rename from Test/HSX/QQSpec.hs
rename to ihp-hsx/Test/IHP/HSX/QQSpec.hs
index 5f69648c7..916a71ed3 100644
--- a/Test/HSX/QQSpec.hs
+++ b/ihp-hsx/Test/IHP/HSX/QQSpec.hs
@@ -1,15 +1,18 @@
{-|
-Module: Test.HSX.QQSpec
+Module: IHP.HSX.QQSpec
Copyright: (c) digitally induced GmbH, 2020
-}
-module Test.HSX.QQSpec where
+module IHP.HSX.QQSpec where
import Test.Hspec
-import IHP.Prelude
+import Prelude
import IHP.HSX.QQ
import qualified Text.Blaze.Renderer.Text as Blaze
import Text.Blaze (preEscapedTextValue)
+import Data.Text
+import IHP.HSX.CustomHsxCases
+tests :: SpecWith ()
tests = do
describe "HSX" do
it "should work with static html" do
@@ -189,6 +192,21 @@ tests = do
[hsx|hello|] `shouldBeHtml` "\nhello"
+ describe "customHsx" do
+ it "should allow specified custom tags" do
+ [myTagsOnlyHsx|<mycustomtag>hello</mycustomtag>|] `shouldBeHtml` "<mycustomtag>hello</mycustomtag>"
+ [myTagsOnlyHsx|<anothercustomtag>world</anothercustomtag>|] `shouldBeHtml` "<anothercustomtag>world</anothercustomtag>"
+
+ it "should allow specified custom attributes" do
+ [myAttrsOnlyHsx|<div my-custom-attr="test">test</div>|] `shouldBeHtml` "<div my-custom-attr=\"test\">test</div>"
+ [myAttrsOnlyHsx|<div anothercustomattr="test">test</div>|] `shouldBeHtml` "<div anothercustomattr=\"test\">test</div>"
+
+ it "should allow combining custom tags and attributes" do
+ [myCustomHsx|<mycustomtag my-custom-attr="test">test</mycustomtag>|] `shouldBeHtml` "<mycustomtag my-custom-attr=\"test\">test</mycustomtag>"
+
+ it "should work with regular HTML tags and attributes too" do
+ [myCustomHsx|<div class="hello">world</div>|] `shouldBeHtml` "<div class=\"hello\">world</div>"
+
data Project = Project { name :: Text }
data PlaceId = PlaceId Text
diff --git a/ihp-hsx/Test/Main.hs b/ihp-hsx/Test/Main.hs
new file mode 100644
index 000000000..288e12321
--- /dev/null
+++ b/ihp-hsx/Test/Main.hs
@@ -0,0 +1,12 @@
+module Main where
+
+import Prelude
+
+import Test.Hspec
+import qualified IHP.HSX.QQSpec
+import qualified IHP.HSX.ParserSpec
+
+main :: IO ()
+main = hspec do
+ IHP.HSX.QQSpec.tests
+ IHP.HSX.ParserSpec.tests
\ No newline at end of file
diff --git a/ihp-hsx/ihp-hsx.cabal b/ihp-hsx/ihp-hsx.cabal
index 0fd3ef0f0..8ee565b4b 100644
--- a/ihp-hsx/ihp-hsx.cabal
+++ b/ihp-hsx/ihp-hsx.cabal
@@ -1,6 +1,6 @@
cabal-version: 2.2
name: ihp-hsx
-version: 1.3.0
+version: 1.4.0
synopsis: JSX-like but for Haskell
description: JSX-like templating syntax for Haskell
license: MIT
@@ -88,4 +88,57 @@ library
, IHP.HSX.ConvertibleStrings
, IHP.HSX.HaskellParser
, IHP.HSX.HsExpToTH
- , IHP.HSX.Attribute
\ No newline at end of file
+ , IHP.HSX.Attribute
+
+test-suite ihp-hsx-tests
+ type: exitcode-stdio-1.0
+ hs-source-dirs: Test
+ main-is: Main.hs
+ other-modules:
+ IHP.HSX.ParserSpec
+ IHP.HSX.QQSpec
+ default-language: Haskell2010
+ build-depends:
+ base >= 4.17.0 && < 4.20
+ , ihp-hsx
+ , hspec
+ , text >= 2.0.1 && < 2.2
+ , bytestring >= 0.11.3 && < 0.13
+ , containers >= 0.6.6 && < 0.7
+ , blaze-markup
+ , megaparsec
+ , template-haskell
+ default-extensions:
+ OverloadedStrings
+ , NoImplicitPrelude
+ , ImplicitParams
+ , Rank2Types
+ , NamedFieldPuns
+ , TypeSynonymInstances
+ , FlexibleInstances
+ , DisambiguateRecordFields
+ , DuplicateRecordFields
+ , OverloadedLabels
+ , FlexibleContexts
+ , DataKinds
+ , QuasiQuotes
+ , TypeFamilies
+ , PackageImports
+ , ScopedTypeVariables
+ , RecordWildCards
+ , TypeApplications
+ , InstanceSigs
+ , DeriveGeneric
+ , MultiParamTypeClasses
+ , TypeOperators
+ , DeriveDataTypeable
+ , DefaultSignatures
+ , BangPatterns
+ , FunctionalDependencies
+ , PartialTypeSignatures
+ , BlockArguments
+ , LambdaCase
+ , StandaloneDeriving
+ , TemplateHaskell
+ , OverloadedRecordDot
\ No newline at end of file
diff --git a/ihp-ide/IHP/IDE/CodeGen/MigrationGenerator.hs b/ihp-ide/IHP/IDE/CodeGen/MigrationGenerator.hs
index c21631b52..41eb714ec 100644
--- a/ihp-ide/IHP/IDE/CodeGen/MigrationGenerator.hs
+++ b/ihp-ide/IHP/IDE/CodeGen/MigrationGenerator.hs
@@ -168,6 +168,7 @@ diffSchemas targetSchema' actualSchema' = (drop <> create)
toDropStatement CreatePolicy { tableName, name } = Just DropPolicy { tableName, policyName = name }
toDropStatement CreateFunction { functionName } = Just DropFunction { functionName }
toDropStatement CreateTrigger { name, tableName } = Just DropTrigger { name, tableName }
+ toDropStatement CreateEventTrigger { name } = Just DropEventTrigger { name }
toDropStatement otherwise = Nothing
@@ -188,7 +189,7 @@ removeNoise = filter \case
StatementCreateTable { unsafeGetCreateTable = CreateTable { name = "schema_migrations" } } -> False
AddConstraint { tableName = "schema_migrations" } -> False
CreateFunction { functionName } | "notify_" `Text.isPrefixOf` functionName -> False
- CreateTrigger { name } | any (`Text.isPrefixOf` name) ["did_update_", "did_delete_", "did_insert_"] -> False
+ CreateTrigger { name } | any (`Text.isPrefixOf` name) ["did_update_", "did_delete_", "did_insert_", "ar_did_update_", "ar_did_delete_", "ar_did_insert_"] -> False
StatementCreateTable { unsafeGetCreateTable = CreateTable { name = "large_pg_notifications" } } -> False
CreateIndex { tableName = "large_pg_notifications" } -> False
_ -> True
diff --git a/ihp-ide/IHP/IDE/CodeGen/ViewGenerator.hs b/ihp-ide/IHP/IDE/CodeGen/ViewGenerator.hs
index 7ffe6d24c..3cecdae59 100644
--- a/ihp-ide/IHP/IDE/CodeGen/ViewGenerator.hs
+++ b/ihp-ide/IHP/IDE/CodeGen/ViewGenerator.hs
@@ -15,8 +15,8 @@ data ViewConfig = ViewConfig
} deriving (Eq, Show)
buildPlan :: Text -> Text -> Text -> IO (Either Text [GeneratorAction])
-buildPlan viewName applicationName controllerName' =
- if (null viewName || null controllerName')
+buildPlan viewName' applicationName controllerName' =
+ if (null viewName' || null controllerName')
then pure $ Left "Neither view name nor controller name can be empty"
else do
schema <- SchemaDesigner.parseSchemaSql >>= \case
@@ -24,6 +24,7 @@ buildPlan viewName applicationName controllerName' =
Right statements -> pure statements
let modelName = tableNameToModelName controllerName'
let controllerName = tableNameToControllerName controllerName'
+ let viewName = tableNameToViewName viewName'
let paginationEnabled = False
let viewConfig = ViewConfig { .. }
pure $ Right $ buildPlan' schema viewConfig
@@ -80,7 +81,7 @@ buildPlan' schema config =
genericView = [trimming|
${viewHeader}
- data ${nameWithSuffix} = {$nameWithSuffix}
+ data ${nameWithSuffix} = ${nameWithSuffix}
instance View ${nameWithSuffix} where
html ${nameWithSuffix} { .. } = [hsx|
diff --git a/ihp-ide/IHP/IDE/Postgres.hs b/ihp-ide/IHP/IDE/Postgres.hs
index 72d3b39d7..6c287c66e 100644
--- a/ihp-ide/IHP/IDE/Postgres.hs
+++ b/ihp-ide/IHP/IDE/Postgres.hs
@@ -19,7 +19,7 @@ startPostgres = do
shouldInit <- needsDatabaseInit
when shouldInit initDatabase
let args = ["-D", "build/db/state", "-k", currentDir <> "/build/db", "-c", "listen_addresses="]
- let params = (Process.proc "postgres" args)
+ let params = (procDirenvAware "postgres" args)
{ Process.std_in = Process.CreatePipe
, Process.std_out = Process.CreatePipe
, Process.std_err = Process.CreatePipe
diff --git a/ihp-ide/IHP/IDE/SchemaDesigner/Compiler.hs b/ihp-ide/IHP/IDE/SchemaDesigner/Compiler.hs
index b05af3c59..aab65287d 100644
--- a/ihp-ide/IHP/IDE/SchemaDesigner/Compiler.hs
+++ b/ihp-ide/IHP/IDE/SchemaDesigner/Compiler.hs
@@ -55,6 +55,8 @@ compileStatement UnknownStatement { raw } = raw <> ";"
compileStatement Set { name, value } = "SET " <> compileIdentifier name <> " = " <> compileExpression value <> ";"
compileStatement SelectStatement { query } = "SELECT " <> query <> ";"
compileStatement DropTrigger { name, tableName } = "DROP TRIGGER " <> compileIdentifier name <> " ON " <> compileIdentifier tableName <> ";"
+compileStatement CreateEventTrigger { name, eventOn, whenCondition, functionName, arguments } = "CREATE EVENT TRIGGER " <> compileIdentifier name <> " ON " <> compileIdentifier eventOn <> " " <> (maybe "" (\expression -> "WHEN " <> compileExpression expression) whenCondition) <> " EXECUTE FUNCTION " <> compileExpression (CallExpression functionName arguments) <> ";"
+compileStatement DropEventTrigger { name } = "DROP EVENT TRIGGER " <> compileIdentifier name <> ";"
-- | Emit a PRIMARY KEY constraint when there are multiple primary key columns
compilePrimaryKeyConstraint :: PrimaryKeyConstraint -> Maybe Text
@@ -132,6 +134,7 @@ compileExpression (EqExpression a b) = compileExpressionWithOptionalParenthese a
compileExpression (IsExpression a (NotExpression b)) = compileExpressionWithOptionalParenthese a <> " IS NOT " <> compileExpressionWithOptionalParenthese b -- 'IS (NOT NULL)' => 'IS NOT NULL'
compileExpression (IsExpression a b) = compileExpressionWithOptionalParenthese a <> " IS " <> compileExpressionWithOptionalParenthese b
compileExpression (InExpression a b) = compileExpressionWithOptionalParenthese a <> " IN " <> compileExpressionWithOptionalParenthese b
+compileExpression (InArrayExpression values) = "(" <> intercalate ", " (map compileExpression values) <> ")"
compileExpression (NotExpression a) = "NOT " <> compileExpressionWithOptionalParenthese a
compileExpression (AndExpression a b) = compileExpressionWithOptionalParenthese a <> " AND " <> compileExpressionWithOptionalParenthese b
compileExpression (OrExpression a b) = compileExpressionWithOptionalParenthese a <> " OR " <> compileExpressionWithOptionalParenthese b
@@ -161,6 +164,7 @@ compileExpressionWithOptionalParenthese expr@(IntExpression {}) = compileExpress
compileExpressionWithOptionalParenthese expr@(DoubleExpression {}) = compileExpression expr
compileExpressionWithOptionalParenthese expr@(DotExpression (VarExpression {}) b) = compileExpression expr
compileExpressionWithOptionalParenthese expr@(ConcatenationExpression a b ) = compileExpression expr
+compileExpressionWithOptionalParenthese expr@(InArrayExpression values) = compileExpression expr
compileExpressionWithOptionalParenthese expression = "(" <> compileExpression expression <> ")"
compareStatement (CreateEnumType {}) _ = LT
@@ -201,6 +205,7 @@ compilePostgresType PInet = "INET"
compilePostgresType PTSVector = "TSVECTOR"
compilePostgresType (PArray type_) = compilePostgresType type_ <> "[]"
compilePostgresType PTrigger = "TRIGGER"
+compilePostgresType PEventTrigger = "EVENT_TRIGGER"
compilePostgresType (PCustomType theType) = theType
compileIdentifier :: Text -> Text
diff --git a/ihp-ide/IHP/IDE/SchemaDesigner/Parser.hs b/ihp-ide/IHP/IDE/SchemaDesigner/Parser.hs
index 753e0c21c..eb83f06fa 100644
--- a/ihp-ide/IHP/IDE/SchemaDesigner/Parser.hs
+++ b/ihp-ide/IHP/IDE/SchemaDesigner/Parser.hs
@@ -272,6 +272,7 @@ sqlType = choice $ map optionalArray
, inet
, tsvector
, trigger
+ , eventTrigger
, singleChar
, customType
]
@@ -429,6 +430,10 @@ sqlType = choice $ map optionalArray
trigger = do
try (symbol' "TRIGGER")
pure PTrigger
+
+ eventTrigger = do
+ try (symbol' "EVENT_TRIGGER")
+ pure PEventTrigger
customType = do
optional do
@@ -459,7 +464,7 @@ table = [
, binary "||" ConcatenationExpression
, binary "IS" IsExpression
- , binary "IN" InExpression
+ , inExpr
, prefix "NOT" NotExpression
, prefix "EXISTS" ExistsExpression
, typeCast
@@ -483,6 +488,11 @@ table = [
char '.'
name <- identifier
pure $ \expr -> DotExpression expr name
+
+ inExpr = Postfix do
+ lexeme "IN"
+ right <- try inArrayExpression <|> expression
+ pure $ \expr -> InExpression expr right
-- | Parses a SQL expression
--
@@ -543,7 +553,10 @@ selectExpr = do
whereClause Nothing <|> explicitAs <|> implicitAs
-
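+-- | Parses a parenthesised list of expressions, e.g. @('a', 'b')@, as it
+-- appears on the right-hand side of an @IN@ expression.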
+inArrayExpression :: Parser Expression
+inArrayExpression = do
+ values <- between (char '(') (char ')') (expression `sepBy` (char ',' >> space))
+ pure (InArrayExpression values)
@@ -599,6 +612,7 @@ createFunction = do
space
lexeme "RETURNS"
returns <- sqlType
+ space
language <- optional do
lexeme "language" <|> lexeme "LANGUAGE"
@@ -625,6 +639,38 @@ createFunction = do
createTrigger = do
lexeme "CREATE"
+ createEventTrigger <|> createTrigger'
+
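+-- Parses e.g.: CREATE EVENT TRIGGER log_ddl ON ddl_command_start EXECUTE FUNCTION log_ddl();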
+createEventTrigger = do
+ lexeme "EVENT"
+ lexeme "TRIGGER"
+
+ name <- qualifiedIdentifier
+ lexeme "ON"
+ eventOn <- identifier
+
+ whenCondition <- optional do
+ lexeme "WHEN"
+ expression
+
+ lexeme "EXECUTE"
+ (lexeme "FUNCTION") <|> (lexeme "PROCEDURE")
+
+ (CallExpression functionName arguments) <- callExpr
+
+ char ';'
+
+ pure CreateEventTrigger
+ { name
+ , eventOn
+ , whenCondition
+ , functionName
+ , arguments
+ }
+
+
+
+createTrigger' = do
lexeme "TRIGGER"
name <- qualifiedIdentifier
@@ -870,6 +916,10 @@ dropPolicy = do
dropTrigger = do
lexeme "DROP"
+
+ dropEventTrigger <|> dropTrigger'
+
+dropTrigger' = do
lexeme "TRIGGER"
name <- qualifiedIdentifier
lexeme "ON"
@@ -877,6 +927,14 @@ dropTrigger = do
char ';'
pure DropTrigger { name, tableName }
+
+dropEventTrigger = do
+ lexeme "EVENT"
+ lexeme "TRIGGER"
+ name <- qualifiedIdentifier
+ char ';'
+ pure DropEventTrigger { name }
+
createSequence = do
lexeme "CREATE"
lexeme "SEQUENCE"
diff --git a/ihp-ide/IHP/IDE/SchemaDesigner/Types.hs b/ihp-ide/IHP/IDE/SchemaDesigner/Types.hs
index 1d9fbf465..bf8af1657 100644
--- a/ihp-ide/IHP/IDE/SchemaDesigner/Types.hs
+++ b/ihp-ide/IHP/IDE/SchemaDesigner/Types.hs
@@ -64,8 +64,12 @@ data Statement
| DropDefaultValue { tableName :: Text, columnName :: Text }
-- | CREATE TRIGGER ..;
| CreateTrigger { name :: !Text, eventWhen :: !TriggerEventWhen, event :: !TriggerEvent, tableName :: !Text, for :: !TriggerFor, whenCondition :: Maybe Expression, functionName :: !Text, arguments :: ![Expression] }
+ -- | CREATE EVENT TRIGGER ..;
+ | CreateEventTrigger { name :: !Text, eventOn :: !Text, whenCondition :: Maybe Expression, functionName :: !Text, arguments :: ![Expression] }
-- | DROP TRIGGER .. ON ..;
| DropTrigger { name :: !Text, tableName :: !Text }
+ -- | DROP EVENT TRIGGER ..;
+ | DropEventTrigger { name :: !Text }
-- | BEGIN;
| Begin
-- | COMMIT;
@@ -165,6 +169,8 @@ data Expression =
| IsExpression Expression Expression
-- | a IN b
| InExpression Expression Expression
+ -- | ('a', 'b')
+ | InArrayExpression [Expression]
-- | NOT a
| NotExpression Expression
-- | EXISTS a
@@ -225,6 +231,7 @@ data PostgresType
| PTSVector
| PArray PostgresType
| PTrigger
+ | PEventTrigger
| PCustomType Text
deriving (Eq, Show)
diff --git a/ihp-ide/IHP/IDE/Types.hs b/ihp-ide/IHP/IDE/Types.hs
index efc287240..ee248aa02 100644
--- a/ihp-ide/IHP/IDE/Types.hs
+++ b/ihp-ide/IHP/IDE/Types.hs
@@ -21,6 +21,12 @@ data ManagedProcess = ManagedProcess
, processHandle :: !ProcessHandle
} deriving (Show)
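+-- | Like 'Process.proc', but wraps the command with @direnv exec .@ when the
+-- devserver was started with direnv support, so spawned processes (e.g.
+-- postgres, ghci) see the project's .envrc environment.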
+procDirenvAware :: (?context :: Context) => FilePath -> [String] -> Process.CreateProcess
+procDirenvAware command args =
+ if ?context.wrapWithDirenv
+ then Process.proc "direnv" (["exec", ".", command] <> args)
+ else Process.proc command args
+
createManagedProcess :: CreateProcess -> IO ManagedProcess
createManagedProcess config = do
process <- Process.createProcess config
@@ -128,6 +134,7 @@ data Context = Context
, ghciInChan :: !(Queue.InChan OutputLine) -- ^ Output of the app ghci is written here
, ghciOutChan :: !(Queue.OutChan OutputLine) -- ^ Output of the app ghci is consumed here
, liveReloadClients :: !(IORef (Map UUID Websocket.Connection))
+ , wrapWithDirenv :: !Bool
}
dispatch :: (?context :: Context) => Action -> IO ()
diff --git a/ihp-ide/IHP/SchemaCompiler.hs b/ihp-ide/IHP/SchemaCompiler.hs
index 260a13fe9..3044cfd47 100644
--- a/ihp-ide/IHP/SchemaCompiler.hs
+++ b/ihp-ide/IHP/SchemaCompiler.hs
@@ -354,7 +354,7 @@ compileInputValueInstance :: CreateTable -> Text
compileInputValueInstance table =
"instance InputValue " <> modelName <> " where inputValue = IHP.ModelSupport.recordToInputValue\n"
where
- modelName = tableNameToModelName table.name
+ modelName = qualifiedConstructorNameFromTableName table.name
-- | Returns all the type arguments of the data structure for an entity
dataTypeArguments :: (?schema :: Schema) => CreateTable -> [Text]
@@ -542,18 +542,30 @@ columnPlaceholder column@Column { columnType } = if columnPlaceholderNeedsTypeca
columnPlaceholderNeedsTypecast Column { columnType = PArray {} } = True
columnPlaceholderNeedsTypecast _ = False
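+-- | The fully qualified constructor name for a table's record type,
+-- e.g. @users@ becomes @Generated.ActualTypes.User@.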
+qualifiedConstructorNameFromTableName :: Text -> Text
+qualifiedConstructorNameFromTableName unqualifiedName = "Generated.ActualTypes." <> (tableNameToModelName unqualifiedName)
+
compileCreate :: CreateTable -> Text
compileCreate table@(CreateTable { name, columns }) =
let
writableColumns = onlyWritableColumns columns
- modelName = tableNameToModelName name
+ modelName = qualifiedConstructorNameFromTableName name
columnNames = commaSep (map (.name) writableColumns)
values = commaSep (map columnPlaceholder writableColumns)
toBinding column@(Column { name }) =
- if hasExplicitOrImplicitDefault column
- then "fieldWithDefault #" <> columnNameToFieldName name <> " model"
- else "model." <> columnNameToFieldName name
+ if hasExplicitOrImplicitDefault column && not isArrayColumn
+ then "fieldWithDefault #" <> columnNameToFieldName name <> " model"
+ else "model." <> columnNameToFieldName name
+ where
+ -- We cannot use DEFAULT with array columns as postgres will throw an error:
+ --
+ -- > DEFAULT is not allowed in this context
+ --
+ -- To work around this error, we explicitly specify an empty array.
+ isArrayColumn = case column.columnType of
+ PArray _ -> True
+ _ -> False
bindings :: [Text]
@@ -591,7 +603,7 @@ onlyWritableColumns columns = columns |> filter (\Column { generator } -> isNoth
compileUpdate :: CreateTable -> Text
compileUpdate table@(CreateTable { name, columns }) =
let
- modelName = tableNameToModelName name
+ modelName = qualifiedConstructorNameFromTableName name
writableColumns = onlyWritableColumns columns
toUpdateBinding Column { name } = "fieldWithUpdate #" <> columnNameToFieldName name <> " model"
@@ -642,7 +654,7 @@ instance FromRow #{modelName} where
|]
where
- modelName = tableNameToModelName name
+ modelName = qualifiedConstructorNameFromTableName name
columnNames = map (columnNameToFieldName . (.name)) columns
columnBinding columnName = columnName <> " <- field"
@@ -721,9 +733,12 @@ instance FromRow #{modelName} where
compileBuild :: (?schema :: Schema) => CreateTable -> Text
compileBuild table@(CreateTable { name, columns }) =
- "instance Record " <> tableNameToModelName name <> " where\n"
+ let
+ constructor = qualifiedConstructorNameFromTableName name
+ in
+ "instance Record " <> constructor <> " where\n"
<> " {-# INLINE newRecord #-}\n"
- <> " newRecord = " <> tableNameToModelName name <> " " <> unwords (map toDefaultValueExpr columns) <> " " <> (columnsReferencingTable name |> map (const "def") |> unwords) <> " def\n"
+ <> " newRecord = " <> constructor <> " " <> unwords (map toDefaultValueExpr columns) <> " " <> (columnsReferencingTable name |> map (const "def") |> unwords) <> " def\n"
compileDefaultIdInstance :: CreateTable -> Text
@@ -762,7 +777,7 @@ toDefaultValueExpr _ = "def"
compileHasTableNameInstance :: (?schema :: Schema) => CreateTable -> Text
compileHasTableNameInstance table@(CreateTable { name }) =
"type instance GetTableName (" <> tableNameToModelName name <> "' " <> unwords (map (const "_") (dataTypeArguments table)) <> ") = " <> tshow name <> "\n"
- <> "type instance GetModelByTableName " <> tshow name <> " = " <> tableNameToModelName name <> "\n"
+ <> "type instance GetModelByTableName " <> tshow name <> " = Generated.ActualTypes." <> tableNameToModelName name <> "\n"
compilePrimaryKeyInstance :: (?schema :: Schema) => CreateTable -> Text
compilePrimaryKeyInstance table@(CreateTable { name, columns, constraints }) = [trimming|type instance PrimaryKey $symbol = $idType|] <> "\n"
@@ -807,7 +822,7 @@ instance #{instanceHead} where
|]
where
instanceHead :: Text
- instanceHead = instanceConstraints <> " => Table (" <> compileTypePattern table <> ")"
+ instanceHead = instanceConstraints <> " => IHP.ModelSupport.Table (" <> compileTypePattern table <> ")"
where
instanceConstraints =
table
diff --git a/ihp-ide/exe/IHP/IDE/DevServer.hs b/ihp-ide/exe/IHP/IDE/DevServer.hs
index a5ccb5cc2..9293650af 100644
--- a/ihp-ide/exe/IHP/IDE/DevServer.hs
+++ b/ihp-ide/exe/IHP/IDE/DevServer.hs
@@ -1,4 +1,4 @@
-module Main (main) where
+module Main (main, mainInParentDirectory) where
import ClassyPrelude
import qualified System.Process as Process
@@ -30,9 +30,25 @@ import qualified IHP.FrameworkConfig as FrameworkConfig
import qualified Control.Concurrent.Chan.Unagi as Queue
import IHP.IDE.FileWatcher
import qualified System.Environment as Env
+import qualified System.Directory as Directory
+
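+-- | Runs the devserver against the IHP app in the parent directory, using the
+-- IHP checkout in the current working directory for IHP_LIB and the static
+-- assets, and wrapping spawned processes with direnv.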
+mainInParentDirectory :: IO ()
+mainInParentDirectory = do
+ cwd <- Directory.getCurrentDirectory
+ let projectDir = cwd <> "/../"
+ Directory.setCurrentDirectory projectDir
+
+ Env.setEnv "IHP_LIB" (cwd <> "/ihp-ide/lib/IHP")
+ Env.setEnv "TOOLSERVER_STATIC" (cwd <> "/ihp-ide/lib/IHP/static")
+ Env.setEnv "IHP_STATIC" (cwd <> "/lib/IHP/static")
+
+ mainWithOptions True
main :: IO ()
-main = withUtf8 do
+main = mainWithOptions False
+
+mainWithOptions :: Bool -> IO ()
+mainWithOptions wrapWithDirenv = withUtf8 do
actionVar <- newEmptyMVar
appStateRef <- emptyAppState >>= newIORef
portConfig <- findAvailablePortConfig
@@ -45,7 +61,7 @@ main = withUtf8 do
logger <- Log.newLogger def
(ghciInChan, ghciOutChan) <- Queue.newChan
liveReloadClients <- newIORef mempty
- let ?context = Context { actionVar, portConfig, appStateRef, isDebugMode, logger, ghciInChan, ghciOutChan, liveReloadClients }
+ let ?context = Context { actionVar, portConfig, appStateRef, isDebugMode, logger, ghciInChan, ghciOutChan, liveReloadClients, wrapWithDirenv }
-- Print IHP Version when in debug mode
when isDebugMode (Log.debug ("IHP Version: " <> Version.ihpVersion))
@@ -215,7 +231,7 @@ startOrWaitPostgres = do
startPostgres
pure ()
-startGHCI :: IO ManagedProcess
+startGHCI :: (?context :: Context) => IO ManagedProcess
startGHCI = do
let args =
[ "-threaded"
@@ -227,7 +243,8 @@ startGHCI = do
, "-ghci-script", ".ghci" -- Because the previous line ignored default ghci config file locations, we have to manual load our .ghci
, "+RTS", "-A128m", "-n2m", "-H2m", "--nonmoving-gc", "-N"
]
- createManagedProcess (Process.proc "ghci" args)
+
+ createManagedProcess (procDirenvAware "ghci" args)
{ Process.std_in = Process.CreatePipe
, Process.std_out = Process.CreatePipe
, Process.std_err = Process.CreatePipe
diff --git a/lib/IHP/DataSync/ihp-datasync.js b/lib/IHP/DataSync/ihp-datasync.js
index 4800c31f7..345834fb2 100644
--- a/lib/IHP/DataSync/ihp-datasync.js
+++ b/lib/IHP/DataSync/ihp-datasync.js
@@ -233,9 +233,10 @@ class DataSyncController {
const APPEND_NEW_RECORD = 0;
const PREPEND_NEW_RECORD = 1;
+export const NewRecordBehaviour = { APPEND_NEW_RECORD, PREPEND_NEW_RECORD };
class DataSubscription {
- constructor(query) {
+ constructor(query, options = null, cache = null) {
if (typeof query !== "object" || !('table' in query)) {
throw new Error("Query passed to `new DataSubscription(..)` doesn't look like a query object. If you're using the `query()` functions to costruct the object, make sure you pass the `.query` property, like this: `new DataSubscription(query('my_table').orderBy('createdAt').query)`");
}
@@ -250,7 +251,18 @@ class DataSubscription {
this.connectError = null;
this.subscriptionId = null;
this.subscribers = [];
- this.records = [];
+
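+ // Seed the records from the shared cache (if one was passed) so subscribers
+ // can render immediately after a page transition; `null` means nothing has
+ // been fetched for this query yet.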
+ if (cache) {
+ const cacheResults = cache.get(JSON.stringify(query));
+ if (cacheResults !== undefined) {
+ this.records = cacheResults;
+ } else {
+ this.records = null;
+ }
+ } else {
+ this.records = null;
+ }
+ this.cache = cache;
this.getRecords = this.getRecords.bind(this);
this.subscribe = this.subscribe.bind(this);
@@ -259,7 +271,7 @@ class DataSubscription {
this.onMessage = this.onMessage.bind(this);
// When a new record is inserted, do we put it at the end or at the beginning?
- this.newRecordBehaviour = this.detectNewRecordBehaviour();
+ this.newRecordBehaviour = (options && 'newRecordBehaviour' in options) ? options.newRecordBehaviour : this.detectNewRecordBehaviour();
this.optimisticCreatedPendingRecordIds = [];
}
@@ -418,6 +430,9 @@ class DataSubscription {
}
updateSubscribers() {
+ if (this.cache) {
+ this.cache.set(JSON.stringify(this.query), this.records);
+ }
for (const subscriber of this.subscribers) {
subscriber(this.records);
}
diff --git a/lib/IHP/DataSync/ihp-querybuilder.js b/lib/IHP/DataSync/ihp-querybuilder.js
index 4fb961568..3517d5e03 100644
--- a/lib/IHP/DataSync/ihp-querybuilder.js
+++ b/lib/IHP/DataSync/ihp-querybuilder.js
@@ -424,7 +424,7 @@ export function recordMatchesQuery(query, record) {
default: throw new Error('Unsupported operator ' + expression.op);
}
}
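+ // A LiteralExpression carries its payload in `.value`; evaluate the
+ // value, not the expression node itself.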
- case 'LiteralExpression': return evaluateDynamicValue(expression);
+ case 'LiteralExpression': return evaluateDynamicValue(expression.value);
case 'ListExpression': return expression.values.map(value => evaluateDynamicValue(value));
default: throw new Error('Unsupported expression in evaluate: ' + expression.tag);
}
diff --git a/lib/IHP/DataSync/index.js b/lib/IHP/DataSync/index.js
index 7577b8883..1674daaaa 100644
--- a/lib/IHP/DataSync/index.js
+++ b/lib/IHP/DataSync/index.js
@@ -19,7 +19,7 @@ import {
whereGreaterThan,
whereGreaterThanOrEqual
} from './ihp-querybuilder.js';
-import { DataSyncController, DataSubscription, initIHPBackend, createRecord, createRecords, updateRecord, updateRecords, deleteRecord, deleteRecords } from './ihp-datasync.js';
+import { DataSyncController, DataSubscription, initIHPBackend, createRecord, createRecords, updateRecord, updateRecords, deleteRecord, deleteRecords, NewRecordBehaviour } from './ihp-datasync.js';
import { Transaction, withTransaction } from './transaction.js';
export {
@@ -45,7 +45,7 @@ export {
whereGreaterThanOrEqual,
/* ihp-datasync.js */
- DataSyncController, DataSubscription, initIHPBackend, createRecord, createRecords, updateRecord, updateRecords, deleteRecord, deleteRecords,
+ DataSyncController, DataSubscription, initIHPBackend, createRecord, createRecords, updateRecord, updateRecords, deleteRecord, deleteRecords, NewRecordBehaviour,
/* transaction.js */
Transaction, withTransaction
diff --git a/lib/IHP/DataSync/react.js b/lib/IHP/DataSync/react.js
index 31cdfe7ca..ba008e110 100644
--- a/lib/IHP/DataSync/react.js
+++ b/lib/IHP/DataSync/react.js
@@ -1,15 +1,10 @@
-import React, { useState, useEffect, useContext } from 'react';
+import React, { useState, useEffect, useContext, useSyncExternalStore } from 'react';
import { DataSubscription, DataSyncController } from './ihp-datasync.js';
// Most IHP apps never use this context because they use session cookies for auth.
// Therefore the default value is true.
export const AuthCompletedContext = React.createContext(true);
-// To avoid too many loading spinners when going backwards and forwards
-// between pages, we cache the result of queries so we can already showing
-// some data directly after a page transition. The data might be a bit
-// outdated, but it will directly be overriden with the latest server state
-// once it has arrived.
const recordsCache = new Map();
/**
@@ -17,41 +12,18 @@ const recordsCache = new Map();
* @example
* const messages = useQuery(query('messages').orderBy('createdAt'));
*/
-export function useQuery(queryBuilder) {
- const [records, setRecords] = useState(() => {
- const strinigifiedQuery = JSON.stringify(queryBuilder.query);
- const cachedRecords = recordsCache.get(strinigifiedQuery);
- return cachedRecords === undefined ? null : cachedRecords;
- });
+export function useQuery(queryBuilder, options = null) {
+ const dataSubscription = DataSubscriptionStore.get(queryBuilder.query, options);
const isAuthCompleted = useContext(AuthCompletedContext);
+ const records = useSyncExternalStore(dataSubscription.subscribe, dataSubscription.getRecords)
- useEffect(() => {
- if (!isAuthCompleted) {
- return;
- }
-
- const strinigifiedQuery = JSON.stringify(queryBuilder.query);
- const cachedRecords = recordsCache.get(strinigifiedQuery);
-
- // Invalidate existing records, as the query might have been changed
- setRecords(cachedRecords === undefined ? null : cachedRecords);
-
- const dataSubscription = new DataSubscription(queryBuilder.query);
- dataSubscription.createOnServer();
+ if (dataSubscription.connectError) {
+ throw dataSubscription.connectError;
+ }
- // The dataSubscription is automatically closed when the last subscriber on
- // the DataSubscription object has been unsubscribed
-
- return dataSubscription.subscribe(records => {
- setRecords(records);
-
- // Update the cache whenever the records change
- recordsCache.set(strinigifiedQuery, records);
- });
- }, [
- JSON.stringify(queryBuilder.query) /* <-- It's terrible - but it works, we should find a better for this */,
- isAuthCompleted
- ])
+ if (!isAuthCompleted) {
+ return null;
+ }
return records;
}
@@ -89,4 +61,42 @@ export function useIsConnected() {
}, [ setConnected ]);
return isConnected;
+}
+
+export class DataSubscriptionStore {
+ static queryMap = new Map();
+
+ // To avoid too many loading spinners when going backwards and forwards
+ // between pages, we cache the result of queries so we can already show
+ // some data directly after a page transition. The data might be a bit
+ // outdated, but it will directly be overridden with the latest server state
+ // once it has arrived.
+ static cache = new Map();
+
+ static get(query, options = null) {
+ const key = JSON.stringify(query) + JSON.stringify(options);
+ const existingSubscription = DataSubscriptionStore.queryMap.get(key)
+
+ if (existingSubscription) {
+ return existingSubscription;
+ } else {
+
+ const subscription = new DataSubscription(query, options, DataSubscriptionStore.cache);
+ subscription.createOnServer();
+ subscription.onClose = () => { DataSubscriptionStore.queryMap.delete(key); };
+
+ DataSubscriptionStore.queryMap.set(key, subscription);
+
+ // If the query changes very rapidly in `useQuery`, it can happen that `dataSubscription.subscribe`
+ // is never called at all. In that case we have an unused DataSubscription lying around. We avoid
+ // keeping too many open connections around by trying to close them a second after opening them.
+ // A second is enough time for react to call the subscribe function. If it's not called by then,
+ // we are most likely dealing with a dead subscription, so we close it.
+ setTimeout(() => {
+ subscription.closeIfNotUsed();
+ }, 1000);
+
+ return subscription;
+ }
+ }
}
\ No newline at end of file
diff --git a/lib/IHP/DataSync/react18.js b/lib/IHP/DataSync/react18.js
deleted file mode 100644
index dbb8af1de..000000000
--- a/lib/IHP/DataSync/react18.js
+++ /dev/null
@@ -1,49 +0,0 @@
-import React, { useState, useEffect, useSyncExternalStore } from 'react';
-import { DataSubscription } from './ihp-datasync.js';
-
-/**
- * Returns the result of the current query in real-time. Suspends while the data is still being fetched from the server.
- * @example
- * const messages = useQuery(query('messages').orderBy('createdAt'));
- */
-export function useQuery(queryBuilder) {
- const dataSubscription = DataSubscriptionStore.get(queryBuilder.query);
-
- if (dataSubscription.isConnected) {
- const records = useSyncExternalStore(dataSubscription.subscribe, dataSubscription.getRecords)
- return records;
- } else if (dataSubscription.connectError) {
- throw dataSubscription.connectError;
- } else {
- throw dataSubscription.createOnServerPromise;
- }
-}
-
-export class DataSubscriptionStore {
- static queryMap = new Map();
- static get(query) {
- const strinigifiedQuery = JSON.stringify(query);
- const existingSubscription = DataSubscriptionStore.queryMap.get(strinigifiedQuery)
-
- if (existingSubscription) {
- return existingSubscription;
- } else {
- const subscription = new DataSubscription(query);
- subscription.createOnServer();
- subscription.onClose = () => { DataSubscriptionStore.queryMap.delete(strinigifiedQuery); };
-
- DataSubscriptionStore.queryMap.set(strinigifiedQuery, subscription);
-
- // If the query changes very rapid in `useQuery` it can happen that the `dataSubscription.subscribe`
- // is never called at all. In this case we have a unused DataSubscription laying around. We avoid
- // to many open connections laying around by trying to close them a second after opening them.
- // A second is enough time for react to call the subscribe function. If it's not called by then,
- // we most likely deal with a dead subscription, so we close it.
- setTimeout(() => {
- subscription.closeIfNotUsed();
- }, 1000);
-
- return subscription;
- }
- }
-}
\ No newline at end of file