Commit

Merge branch 'digitallyinduced:master' into master
Montmorency authored Nov 19, 2024
2 parents 040962d + 44d8992 commit 58b252d
Showing 60 changed files with 2,101 additions and 640 deletions.
3 changes: 2 additions & 1 deletion .github/workflows/tests.yml
@@ -2,7 +2,8 @@ name: "Tests + Compile"
on:
pull_request:
branches: [master]

push:
branches: 'master'
jobs:
tests:
runs-on: ARM64
3 changes: 3 additions & 0 deletions .gitignore
@@ -22,3 +22,6 @@ devenv.local.nix
result*

.idea

# Test folders
static/Test.FileStorage.ControllerFunctionsSpec
29 changes: 29 additions & 0 deletions Guide/deployment.markdown
@@ -12,6 +12,35 @@ AWS EC2 is a good choice for deploying IHP in a professional setup.

### AWS infrastructure preparation

#### Creating infrastructure with Terraform

The EC2 instance, RDS database, VPC, subnets, security groups, etc. can be set up automatically using [Terraform](https://www.terraform.io/).

1. Install Terraform
2. Set up your AWS credentials in `.aws/config` and `.aws/credentials`
3. Copy the files from the `IaC/aws` folder of [the branch IaC-aws in ihp-boilerplate](https://github.com/digitallyinduced/ihp-boilerplate/tree/IaC-aws) into your IHP project repo. Then run the init command from the `IaC/aws` folder:
```
terraform init
```
4. Create the file `terraform.tfvars` with the following content:
```
prefix = "Project prefix for the resource names"
region = "AWS Region to deploy to"
az_1 = "Availability Zone 1"
az_2 = "Availability Zone 2"
key_name = "The key name of the SSH key-pair"
db_password = "The password for the RDS database"
```
- The two AZs are needed to set up the RDS database.
- The SSH key-pair should be created in the AWS web interface.
5. Run:
```
terraform apply
```
6. Important data, like the RDS endpoint and the EC2 instance URL, is written to the file `db_info.txt`.

Now the NixOS instance and the Postgres database are set up, and an SSH connection to the instance can be established.
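
For illustration, a filled-in `terraform.tfvars` could look like the following. Every value below is a placeholder assumption; substitute your own:

```
prefix      = "myapp"
region      = "eu-central-1"
az_1        = "eu-central-1a"
az_2        = "eu-central-1b"
key_name    = "my-ec2-keypair"
db_password = "a-strong-password"
```

With such a file in place, `terraform apply` can provision the resources without interactively prompting for variable values.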

#### Creating a new EC2 Instance

Start a new EC2 instance and use the official NixOS AMI `NixOS-23.05.426.afc48694f2a-x86_64-linux`. You can find the latest NixOS AMI at https://nixos.org/download#nixos-amazon
96 changes: 96 additions & 0 deletions Guide/hsx.markdown
@@ -278,6 +278,102 @@

If you use HTML entities, such as ` ` for a non-breaking space, you will notice they appear exactly like that. To output directly (i.e. unescaped) use the method `preEscapedToMarkup` from `Text.Blaze.Html5`.

### Custom HSX and Unchecked HSX

HSX provides two additional QuasiQuoters beyond the standard `[hsx|...|]` for increased flexibility: `uncheckedHsx` and `customHsx`.

#### Using `uncheckedHsx`

`uncheckedHsx` provides a quick way to bypass HSX's strict tag and attribute name checking.

It will still check for a valid HTML structure, but it will accept any tag and attribute names.


```haskell
[uncheckedHsx|
<anytagname custom-attribute="value">
Content
</anytagname>
|]
```

While convenient for rapid development, use it with caution as you lose the benefits of compile-time guarantees for your markup.

#### Using `customHsx`

`customHsx` allows you to extend the default HSX with additional whitelisted tag names and attribute names while maintaining the same strict compile-time checking of the default `hsx`.

This makes it easier to use custom elements, which often come with special attributes, as well as JavaScript libraries such as `_hyperscript` that use `_` as an attribute name.


To use `customHsx`, you need to create it in a separate module due to Template Haskell restrictions. Here's how to set it up:

1. First, create a new module for your custom HSX (e.g., `Application.Helper.CustomHsx`):

```haskell
module Application.Helper.CustomHsx where

import IHP.Prelude
import IHP.HSX.QQ (customHsx)
import IHP.HSX.Parser
import Language.Haskell.TH.Quote
import qualified Data.Set as Set

myHsx :: QuasiQuoter
myHsx = customHsx
(HsxSettings
{ checkMarkup = True
, additionalTagNames = Set.fromList ["book", "heading", "name"]
, additionalAttributeNames = Set.fromList ["_", "custom-attribute"]
}
)
```

Configuration options for `HsxSettings`:
- `checkMarkup`: Boolean to enable/disable markup checking
- `additionalTagNames`: Set of additional allowed tag names
- `additionalAttributeNames`: Set of additional allowed attribute names

2. Make it available in your views by adding it to your view helpers module:

```haskell
module Application.Helper.View (
module Application.Helper.View,
module Application.Helper.CustomHsx -- Add this line
) where

import IHP.ViewPrelude
import Application.Helper.CustomHsx (myHsx) -- Add this line
```

3. Use it in your views:

```haskell
[myHsx|
<book _="on click log 'Hello'">
<heading custom-attribute="value">My Book</heading>
<name>Author Name</name>
</book>
|]
```

The custom HSX will validate that tags and attributes are either in the default HSX whitelist or in your additional sets. This gives you the flexibility to use custom elements and attributes.

This approach is particularly useful for:
- Web Components with custom attribute names
- UI libraries with non-standard attributes
- Domain-specific XML markup languages like [Hyperview](https://hyperview.org/docs/example_navigation)
- Integration with third-party frameworks that extend HTML syntax

`customHsx` whitelisting and even `uncheckedHsx` do not help for libraries with very unusual symbols in their attributes, such as Alpine.js, because neither recognizes HTML attributes that start with `@` or contain `:` in the attribute name. In these cases, the spread syntax `{...attributeList}` is likely your best bet.

```haskell
-- This will not work
[uncheckedHsx|<button @click="open = true">Expand</button>|]

-- Using spread syntax will work
[hsx|<button {...[("@click", "open = true" :: Text)]}>Expand</button>|]
```
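
When a library needs several such attributes, the spread list simply grows. The following sketch assumes the same spread mechanism shown above; the attribute names are Alpine.js conventions, not part of HSX:

```haskell
-- Hypothetical dropdown driven by multiple Alpine.js attributes:
[hsx|
    <div {...[("x-data", "{ open: false }" :: Text)]}>
        <button {...[("@click", "open = !open" :: Text)]}>Toggle</button>
        <div {...[("x-show", "open" :: Text)]}>Menu</div>
    </div>
|]
```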

## Common HSX Patterns

60 changes: 33 additions & 27 deletions Guide/package-management.markdown
@@ -371,47 +371,53 @@ After that try to run `devenv up`.

### Building Postgres With Extensions

For some applications you may want to install custom postgres extension
libraries and have them available in the nix store.

For example to enable the [postgis](https://postgis.net/) spatial
and geographic objects in PostgreSQL add
`services.postgres.extensions = extensions: [ extensions.postgis ];` to your project's `flake.nix`:


```nix
{
    inputs = {
        ihp.url = "github:digitallyinduced/ihp/v1.1";
        nixpkgs.follows = "ihp/nixpkgs";
        flake-parts.follows = "ihp/flake-parts";
        devenv.follows = "ihp/devenv";
        systems.follows = "ihp/systems";
    };

    outputs = inputs@{ ihp, flake-parts, systems, nixpkgs, ... }:
        flake-parts.lib.mkFlake { inherit inputs; } {
            systems = import systems;
            imports = [ ihp.flakeModules.default ];
            perSystem = { pkgs, ... }: {
                ihp = {
                    enable = true;
                    projectPath = ./.;
                    packages = with pkgs; [];
                    haskellPackages = p: with p; [
                        # ...
                    ];
                };
                devenv.shells.default = {
                    services.postgres.extensions = extensions: [ extensions.postgis ];
                };
            };
        };
}
```

Behind the scenes this passes a function to the `postgresql.withPackages` Nix function, which makes the extension available in the Postgres package in your app's Nix store.

After the install you can run `CREATE EXTENSION postgis;` to enable all the features of the
installed extension.
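
For example, after connecting to the database you can enable and then verify the extension. `PostGIS_Version()` is a standard PostGIS helper, shown here as a quick sanity check:

```sql
-- Enable PostGIS once per database (no-op if already enabled):
CREATE EXTENSION IF NOT EXISTS postgis;

-- Confirm the extension is active and report its version:
SELECT PostGIS_Version();
```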

### Stopping Nix From Running Tests for a Haskell Dependency
53 changes: 1 addition & 52 deletions Guide/testing.markdown
@@ -303,55 +303,4 @@

## GitHub Actions

A GitHub Actions workflow can be used to run the tests on CI and to do deployments. Consult the [IHP Boilerplate example](https://github.com/digitallyinduced/ihp-boilerplate/blob/master/.github/workflows/test.yml) for more details.
4 changes: 2 additions & 2 deletions Guide/your-first-project.markdown
@@ -20,7 +20,7 @@

If you don't already use cachix, you will be prompted to install it. You don't need it, but it is highly recommended, as it reduces build time dramatically. Learn more about cachix [here](https://cachix.org/).

While the build is running, take a look at ["What Is Nix"](https://shopify.engineering/what-is-nix) by Shopify to get a general understanding of how Nix works.

In case some errors appear now or in later steps:

@@ -67,7 +67,7 @@ cd blog
Start the development server by running the following in the `blog` directory:

```bash
devenv up
```

Your application is starting now. The development server will automatically launch the built-in IDE.
33 changes: 20 additions & 13 deletions IHP/DataSync/ChangeNotifications.hs
@@ -41,7 +41,7 @@ createNotificationFunction :: RLS.TableWithRLS -> PG.Query
createNotificationFunction table = [i|
DO $$
BEGIN
CREATE FUNCTION "#{functionName}"() RETURNS TRIGGER AS $BODY$
DECLARE
payload TEXT;
large_pg_notification_id UUID;
@@ -86,24 +86,31 @@
RETURN new;
END;
$BODY$ language plpgsql;
DROP TRIGGER IF EXISTS "#{insertTriggerName}" ON "#{tableName}";
DROP TRIGGER IF EXISTS "#{updateTriggerName}" ON "#{tableName}";
DROP TRIGGER IF EXISTS "#{deleteTriggerName}" ON "#{tableName}";


CREATE TRIGGER "#{insertTriggerName}" AFTER INSERT ON "#{tableName}" FOR EACH ROW EXECUTE PROCEDURE "#{functionName}"();
CREATE TRIGGER "#{updateTriggerName}" AFTER UPDATE ON "#{tableName}" FOR EACH ROW EXECUTE PROCEDURE "#{functionName}"();
CREATE TRIGGER "#{deleteTriggerName}" AFTER DELETE ON "#{tableName}" FOR EACH ROW EXECUTE PROCEDURE "#{functionName}"();
EXCEPTION
WHEN duplicate_function THEN
null;

IF NOT EXISTS (
SELECT FROM pg_catalog.pg_class c
JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace
WHERE c.relname = 'large_pg_notifications'
AND n.nspname = 'public'
) THEN
CREATE UNLOGGED TABLE large_pg_notifications (
id UUID DEFAULT uuid_generate_v4() PRIMARY KEY NOT NULL,
payload TEXT DEFAULT NULL,
created_at TIMESTAMP WITH TIME ZONE DEFAULT now() NOT NULL
);
CREATE INDEX large_pg_notifications_created_at_index ON large_pg_notifications (created_at);
END IF;
END; $$
|]

3 changes: 2 additions & 1 deletion IHP/DataSync/Controller.hs
@@ -8,6 +8,7 @@ import IHP.DataSync.RowLevelSecurity
import qualified Database.PostgreSQL.Simple.ToField as PG
import qualified IHP.DataSync.ChangeNotifications as ChangeNotifications
import IHP.DataSync.ControllerImpl (runDataSyncController, cleanupAllSubscriptions)
import IHP.DataSync.DynamicQueryCompiler (camelCaseRenamer)

instance (
PG.ToField (PrimaryKey (GetTableName CurrentUserRecord))
@@ -21,5 +22,5 @@ instance (
run = do
ensureRLSEnabled <- makeCachedEnsureRLSEnabled
installTableChangeTriggers <- ChangeNotifications.makeCachedInstallTableChangeTriggers
runDataSyncController ensureRLSEnabled installTableChangeTriggers (receiveData @ByteString) sendJSON (\_ _ -> pure ()) (\_ -> camelCaseRenamer)
onClose = cleanupAllSubscriptions