Replies: 8 comments 15 replies
-
Perfect implementation, thanks for sharing!
Hence the name (
That’s right. Since the response is a stream of server-sent events, Datastar processes each fragment in real time as it is sent to the browser. So in theory, if you send a regular fragment followed by a long-running fragment (one that hits a third-party API, for example), the first fragment will be processed on the front end as soon as it is received, and the long-running fragment will follow whenever it completes rendering. The `{% fragment %}` tag also accepts a `settle` option:

```twig
{# Settles the fragment after 500ms – useful for transitions. #}
{% fragment with { settle: 500 } %}
    ...
{% endfragment %}
```
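To illustrate the streaming behavior described above, a single template could contain both a quick fragment and a slow one. This is only a sketch: the `craft.someApi` call is hypothetical, standing in for any long-running third-party lookup.

```twig
{# This fragment renders instantly and is streamed to the browser first. #}
{% fragment %}
    <div id="status">Loading results…</div>
{% endfragment %}

{# This fragment blocks on a slow (hypothetical) API call, so it arrives
   whenever it finishes rendering, without holding up the fragment above. #}
{% fragment %}
    <div id="results">
        {% for item in craft.someApi.results() %}
            <p>{{ item }}</p>
        {% endfor %}
    </div>
{% endfragment %}
```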
-
Being forced to revisit an old project, where such a search (with the result list as a dropdown) was implemented as an Alpine JS component: I'd love to see how frontend keyboard/mouse events would be handled in Spark, such as ...
Supporting those things was the main reason to use Alpine alone (maybe it would have been possible to build that on top of Sprig, but I didn't want to mix two frameworks...)
-
@wsydney76 it sounds like all of these things would be handled at the Datastar level. There's an active channel in the htmx Discord server, in case you run into any issues: https://discord.gg/JUKb6RpJ
-
@wsydney76 check out: https://datastar.fly.dev/guide/batteries_included
Datastar takes a lot of inspiration from Alpine.
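For example, keyboard handling for a results dropdown might be expressible with plain `data-on-*` attributes. This is a hypothetical sketch extrapolated from the `data-on-click`/`data-on-input` pattern shown elsewhere in this thread; the event object name and the `selected` store item are assumptions to check against the Datastar docs.

```html
<!-- Hypothetical: arrow keys move a highlight index ($selected) held in the store. -->
<div data-store="{ search: '', selected: 0 }">
    <input
        data-model="search"
        data-on-keydown="
            event.key === 'ArrowDown' && $selected++;
            event.key === 'ArrowUp' && $selected--;
        "
        type="text"
    />
</div>
```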
-
@bencroker you say:

...but it is rendering JavaScript, unless I'm misunderstanding something. If I do:

```html
<button
    id="dialogs"
    data-store="{prompt:'foo',confirm:false}"
    data-fetch-url=""
    data-on-click="$prompt=prompt('Enter a string',$prompt);$confirm=confirm('Are you sure?');$confirm && $$get('/examples/dialogs___browser/sure')"
>
    Click Me
</button>
```

So you should be able to do something like this:

```twig
<button
    id="dialogs"
    data-store="{prompt:'foo',confirm:false}"
    data-fetch-url=""
    data-on-click="$prompt=prompt('Enter a string',$prompt);$confirm=confirm('Are you sure?');$confirm && {{ spark.get('/examples/dialogs___browser/sure.twig') }}"
>
    Click Me
</button>
```

It looks weird because of the ...
-
Related: Proposed Spark API changes -> #5
-
Thanks, Datastar author here. If you have any questions, don't hesitate to ask!
-
Just a note about the revised API (as of about an hour ago!), for anyone recreating @khalwat's search example above. The index template would look something like this:

```twig
<section data-store="{ search: '' }">
    <h3>Spark search</h3>
    <input
        data-model="search"
        data-on-input.debounce_500ms="$$get('{{ sparkUrl('spark/_fragments/search-results.twig', {}, true) }}')"
        placeholder="Search..."
        type="text"
    />
    <div id="search-results">
    </div>
</section>
```

And the template:

```twig
<div id="search-results">
    {% set search = store.get('search') ?? '' %}
    {% set results = [] %}
    {% if search is not empty %}
        {% set results = craft.entries()
            .section(['demo'])
            .limit(9)
            .search(search)
            .orderBy('score')
            .all() %}
    {% endif %}
    <ul>
        {% for result in results %}
            <li><a href="{{ result.url }}">{{ result.title }}</a></li>
        {% else %}
            <li>No results</li>
        {% endfor %}
    </ul>
</div>
```
-
EDIT: Updated the code below to conform to the Spark 0.0.7 API changes (thank you, Ben!)

So to play with Spark, I decided to implement the same auto-complete search that I use on the nystudio107.com website for finding articles.

The existing nystudio107.com implementation works using an Autocomplete Vue component on the frontend, which pings an Element API endpoint. Here's what the HTML/Vue code looks like:

...and here's what the Element API endpoint looks like:

So then I decided to try to implement this via Spark/Datastar. Here's what the index page looks like:
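(The index page code in the original post was embedded as a screenshot. Based on the description that follows and the revised example posted earlier in this thread, it would be roughly this; a reconstruction, not the exact original.)

```twig
<section data-store="{ search: '' }">
    <h3>Spark search</h3>
    <input
        data-model="search"
        data-on-input.debounce_500ms="$$get('{{ sparkUrl('spark/_fragments/search-results.twig', {}, true) }}')"
        placeholder="Search..."
        type="text"
    />
    <div id="search-results"></div>
</section>
```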
What this does is say that the `<input>` element's value should have a two-way binding with the `search` item in our data store, and any time text is input into the field, it should render the `spark/_fragments/search-results.twig` template and replace the `<div id="search-results">` DOM element with the one returned by rendering that Twig template.

The `.debounce_500ms` modifier just tells it to delay rendering the template until nothing additional has changed in the `<input>` element for `500ms`, to prevent slamming the backend with `get` requests.

...and here's what the Twig code looks like for the fragment that it loads via Spark:
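(The fragment template was likewise embedded as a screenshot. Based on the search-results template shown earlier in this thread, plus the `{% fragment %}` tag and the `@var` comment described below, it would look something like this; a reconstruction, not the exact original.)

```twig
{# @var store \attributes\stores\AutocompleteSearch #}
{% fragment %}
    <div id="search-results">
        {% set search = store.get('search') ?? '' %}
        {% set results = [] %}
        {% if search is not empty %}
            {% set results = craft.entries()
                .section(['demo'])
                .limit(9)
                .search(search)
                .orderBy('score')
                .all() %}
        {% endif %}
        <ul>
            {% for result in results %}
                <li><a href="{{ result.url }}">{{ result.title }}</a></li>
            {% else %}
                <li>No results</li>
            {% endfor %}
        </ul>
    </div>
{% endfragment %}
```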
This is mostly just standard Twig. The only two things to note are that the `search` Twig variable is automatically set, because it's in our store, and that the `{% fragment %}` tag causes Spark to render out an "Event" that includes some instructions for what Datastar should do with it, as well as the rendered HTML.

...and here's what it looks like in the browser (yes, it's ugly, I didn't format anything):

spark-autocomplete-demo.mov
If you're wondering what the `{# @var store \attributes\stores\AutocompleteSearch #}` comment is, I'm leveraging PHP 8 Attributes to add a class that describes the `store` data structure so I can get nice auto-complete in PhpStorm. It's really easy: you just create a little stub class and away you go.
Overall, I think the Spark/Datastar implementation seems a whole lot easier, and it's nice to be able to work with Twig the way you're used to working with Twig.
It didn't take too long to figure out the concepts needed to make it work, and it felt pretty good to use.
I'm still not extremely well-versed with how Datastar works, but the main things I needed to learn when using Spark with it are:
- A `data-store` property on an HTML element in the main template's DOM holds your state for use by Datastar/Spark
- Everything is prefixed with `data-`, so it's just leveraging HTML data attributes
- Anything in the `data-store` is available in your Twig templates that are rendered by Spark, as regular old Twig variables
- Spark uses element `id`s to figure out what parts of the DOM to replace when Spark renders a template
- You can also set the store from your Twig templates via `{% do spark.setStore({ }) %}`
- `data-on-` lets you do things when various JavaScript events happen; it's just JavaScript that you put in there. Spark outputs the JavaScript you need when you use things like `{{ spark.get() }}`
- The `{% fragment %}` tag tells Spark that what is in that tag pair is an HTML fragment that should be sent back. A template can have as many fragments as you like, so multiple DOM elements can be individually replaced with one request

Unrelated to Spark (because it generates this for you), but relevant to Datastar if you're looking at how it works under the hood:

- The `$` symbol is used to prefix items in your store, so for instance `$search` references the `search` item in your store
- The `$$` symbols are used to prefix various baked-in Datastar functions like `$$get()` or `$$put()`
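Putting the two prefixes together, a minimal illustration (the `/search` URL is just a placeholder):

```html
<!-- $search reads the `search` item from the store declared in data-store;
     $$get() is a baked-in Datastar action that fetches from the backend. -->
<div data-store="{ search: '' }">
    <input data-model="search" type="text" />
    <button data-on-click="$search.length > 0 && $$get('/search')">Go</button>
</div>
```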
Like I said, just scratching the surface, but pretty cool!