diff --git a/.deco/blocks/blogposts.json b/.deco/blocks/blogposts.json index 45b465ef..46642da4 100644 --- a/.deco/blocks/blogposts.json +++ b/.deco/blocks/blogposts.json @@ -1 +1 @@ -{"list":{"posts":[{"body":{"en":{"title":"More on edge async rendering: unveiling new deco.cx capabilities","descr":"Elevating web performance with edge async rendering","content":"## More on edge async rendering: unveiling new [deco.cx](http://deco.cx) capabilities\n\n### Introduction: Elevating web performance with edge async rendering\n\nOver the past few months, we've been working on improving the performance of web pages with our [async render](https://deco.cx/en/blog/async-rendering) paradigm. This approach aimed to reduce latency issues associated with traditional rendering methods, providing faster page responses and a smoother user experience. Today, we're introducing a new milestone in this journey: the \"Stale Edge Cache.\"\n\n### Why edge async rendering matters\n\nAt deco.cx, we understand that performance is crucial. Faster-loading websites lead to better user engagement, higher conversion rates, and improved SEO rankings. Our edge async rendering technique ensures that users experience quick load times, even when third-party APIs are slow or unresponsive. This translates to a smoother, more enjoyable browsing experience for your visitors.\n\n### The Stale Edge Cache: a game changer\n\nThe Stale Edge Cache is designed to enable stale caching for lazy-loaded sections from async render, significantly reducing response times and enhancing load times for a better user experience.\n\n### Simplified workflow with edge caching\n\nBefore the introduction of edge caching, a user's request for an async-rendered section followed a complex path:\n\n1. The browser sends a request to the CDN.\n2. The CDN forwards the request to the origin server.\n3. The origin server processes the request, renders the section, and sends the output back.\n4. 
This process repeats for every request, resulting in multiple latency points.\n\n![workflow without edge cache](https://ozksgdmyrqcxcwhnbepg.supabase.co/storage/v1/object/public/assets/530/7d1ef95b-dfa5-4e2e-91b9-b97628ee71b6)\n\nWith the Stale Edge Cache, the first server reply for the section is cached in the CDN. Subsequent requests are answered with this cached reply, dramatically reducing the total response time to just the latency between the browser and the CDN plus the content download time.\n\n![workflow with edge cache](https://ozksgdmyrqcxcwhnbepg.supabase.co/storage/v1/object/public/assets/530/6e08c03c-2715-41aa-afad-1887964b11e5)\n\n### The new default: async render with Stale Edge Cache\n\nWe are thrilled to announce that async render with the Stale Edge Cache will now be the default setting for all sections. This means that once your site is updated, all newly added sections will automatically benefit from async rendering and edge caching. Each section will have a new \"optimization\" setting in the form, which is enabled by default but can be disabled if needed.\n\n### Minimizing Cumulative Layout Shift (CLS)\n\nTo ensure a smooth user experience, we highly recommend implementing the **LoadingFallback** component in all sections of your site. 
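\n\nAs a minimal sketch, a section can export a **LoadingFallback** next to its default export; the markup and fixed height below are illustrative placeholders, not a prescribed implementation:\n\n```tsx\nexport function LoadingFallback() {\n  // Reserve roughly the section's final height so late-arriving\n  // async content does not push the rest of the page around\n  return <div style={{ minHeight: \"480px\" }}>Loading...</div>;\n}\n```\n\n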
This component provides a custom loading state during the async render process, minimizing potential content layout shifts and enhancing visual stability.\n\n### FAQ: Enabling and managing async render with Stale Edge Cache\n\n**Q: What do I need to enable this feature?**\n\n**A:** Simply update the version of `deco` and `apps` dependencies on your site.\n\n**Q: What will happen to my current sections on the pages?**\n\n**A:** Nothing will change automatically, but you can enable async rendering for these sections in the admin panel.\n\n**Q: Can I disable the async render?**\n\n**A:** Yes, you can turn off the async render by disabling the optimization setting in the admin panel.\n\n![admin optimization setting](https://ozksgdmyrqcxcwhnbepg.supabase.co/storage/v1/object/public/assets/530/3d9ee3b7-cce2-47f3-a320-2a72b2e63e2a)\n\n### Conclusion\n\nOur continuous efforts in edge async rendering are aimed at providing you with the best tools to create high-performance websites. The introduction of the Stale Edge Cache marks a significant step forward in this journey, ensuring faster load times and an improved user experience. 
Update your site today and experience the transformative impact of async render with edge caching.\n\nContribute to this topic in this [github issue](https://github.com/deco-cx/community/issues/13).\n\nFor more details, visit our documentation and explore how these new capabilities can benefit your projects.","seo":{"title":"More on edge async rendering: unveiling new deco.cx capabilities","description":"Introduction: Elevating web performance with edge async rendering","image":"https://ozksgdmyrqcxcwhnbepg.supabase.co/storage/v1/object/public/assets/530/0c77486d-7514-4e44-9794-979dccf63018"}}},"tags":[],"path":"async-render-default","date":"07/16/2024","author":"Igor Brasileiro","img":"https://ozksgdmyrqcxcwhnbepg.supabase.co/storage/v1/object/public/assets/530/0c77486d-7514-4e44-9794-979dccf63018","authorRole":"Software Engineering","authorAvatar":"https://lh3.googleusercontent.com/a/ACg8ocIW-_JDhQKd8vtQBip4uUk_Nj5NUyRdJdalXMzjzJ0fLSk3FNgJ=s96-c"},{"img":"https://ozksgdmyrqcxcwhnbepg.supabase.co/storage/v1/object/public/assets/530/58575cda-d89d-4715-8497-0050fbff26cf","body":{"en":{"descr":"Simplifying event handling with inline scripts","title":"Introducing `useScript`","content":"At deco.cx, we are always making web development more efficient and effective for developers. Today, we are excited to introduce a new utility function: `useScript`. This function is designed to streamline the process of handling events on HTML elements by leveraging the powerful `hx-on:` handlers from HTMX.\n\n### What is `useScript`?\n\n`useScript` is a utility function that takes a JavaScript function and its arguments, and returns a stringified, minified version of the function. This allows developers to inline scripts directly into their HTML with minimal payload, which is especially useful for event handling.\n\n### Why is `useScript` incredibly useful?\n\nWith HTMX, developers can create dynamic, server-rendered HTML pages with ease. 
However, when it comes to handling client-side events, there’s often a need to include small pieces of JavaScript. This is where `useScript` shines. By using `useScript`, you can add JavaScript only where necessary, avoiding the overhead of a full client-side framework like React.\n\n### Example: Inline script with `hx-on:click`\n\nLet's look at a simple example where `useScript` is used to handle a click event:\n\n```tsx\nimport { useScript } from \"deco/hooks/useScript.ts\";\n\nconst onClick = () => {\n  // `event` is the global window.event available when the inlined handler runs\n  event!.currentTarget.innerText = \"Clicked!\";\n};\n\nfunction ExampleButton() {\n  return (\n    <button type=\"button\" hx-on:click={useScript(onClick)}>\n      Click me\n    </button>\n  );\n}\n\nexport default ExampleButton;\n```\n\nIn this example, `useScript` takes the `onClick` function and inlines it into the `hx-on:click` attribute, making the button interactive without loading a large JavaScript framework.\n\n### Bridging the gap between server and client\n\n`useScript` offers a unique balance between server-rendered and client-side interactions. By combining the strengths of HTMX for processing large HTML chunks on the server with the ability to add small, targeted JavaScript interactions, `useScript` delivers the best of both worlds. This approach allows developers to build performant, interactive web applications without the need for complex toolsets like React.\n\n### Notes and Limitations\n\nWhile `useScript` is a powerful tool, there are a few things to keep in mind:\n\n1. **No Build Tool**: Since we don’t use a build tool, developers must ensure that their JavaScript functions are compatible with their target browsers. This means keeping your code in sync with your [browserslist](https://browsersl.ist/) target.\n2. **Scope and Dependencies**: The function you pass to `useScript` should not rely on external variables or closures that won’t be available when the script is executed inline. Make sure the function is self-contained and does not depend on external state.\n3. 
**Attribute Length**: When using `hx-on:` handlers, ensure the minified function does not exceed any attribute length limits imposed by browsers or HTML specifications.\n\n### Conclusion\n\n`useScript` is a valuable addition to our toolkit, enabling developers to add small, targeted JavaScript interactions to their server-rendered HTML. By leveraging the power of HTMX for large chunks of HTML and using `useScript` for small event handlers, you can create efficient, interactive web applications without the overhead of a full client-side framework. Try out `useScript` today and experience the best of both worlds in your web development projects.\n\nFor more details, visit the [useScript API reference](https://deco.cx/docs/en/api-reference/use-script).\n\nHappy coding!"}},"date":"06/21/2024","path":"introducing-use-script","tags":[],"author":"Tiago Gimenes","authorRole":"Developer"},{"img":"https://ozksgdmyrqcxcwhnbepg.supabase.co/storage/v1/object/public/assets/530/f3f37747-4e4f-4e00-bd5a-bf54bff0a3ec","body":{"en":{"descr":"How They’re Draining Your Website’s Resources","title":"🚀 A New Era of Bots and Crawlers","content":"![Bots and Crawlers](https://ozksgdmyrqcxcwhnbepg.supabase.co/storage/v1/object/public/assets/530/f3f37747-4e4f-4e00-bd5a-bf54bff0a3ec)\n\nYour website’s traffic doesn’t just come from human visitors; bots play a significant role too. **Search engines, social media platforms, and even AI systems deploy automated tools (robots, or 'bots') to crawl your site, extracting content and valuable information.** To rank well on Google, for example, your content must be well-structured, with clear titles, readable text, and highly relevant information. This entire analysis is conducted by bots crawling your site!\n\nBut here’s the catch: **Every time a bot crawls your site, it’s not “free.”** Each request made by a bot consumes resources—whether it's computational power or bandwidth. 
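\n\nTo put illustrative numbers on that: a crawler fetching a 2 MB category page 100,000 times in a month consumes roughly 200 GB of bandwidth on its own, before a single human visitor has been served.\n\n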
Most major bots respect a special file (`robots.txt`) that tells them which parts of your site they can or cannot access. As a site owner, you can control which bots are allowed to crawl your site.\n\n```\nUser-agent: *\nAllow: /\n```\n\n_A simple rule that allows all bots to access all pages._\n\nLet’s look at the impact this can have. \n\nIn May, across various Deco sites, bots were responsible for **over 50% of the bandwidth consumed**, even though they didn’t make up the majority of the requests.\n\n![Bots Bandwidth Consumption](https://ozksgdmyrqcxcwhnbepg.supabase.co/storage/v1/object/public/assets/530/54a19e86-09b8-420d-a392-67ef33a4af93)\n\nDespite accounting for less than 20% of traffic, **bots often consume significantly more bandwidth due to the specific pages they access.** They tend to spend more time on larger pages, such as category pages in online stores, which naturally load more data. These pages often feature product variations and filters, making them even more data-heavy.\n\nWhile Google’s bot respects the `nofollow` attribute, which prevents links from being crawled, not all bots do. \nThis means that pages with filter variations also need a `noindex` meta tag or a more specialized `robots.txt` configuration.\n\n## AI: The New Gold Rush\n\nAI is changing the game when it comes to data extraction, release, and value. \n\nThe demand for massive amounts of data for processing has led to the creation of more bots, particularly crawlers operating on the web. 
**Data is more valuable than ever, yet there’s no guarantee of immediate returns for those who hand over their data to third parties.** The third-largest consumer of bandwidth (Amazonbot) and several others (Ahrefs, Semrush, Bing) are known as “good bots.” **These verified bots respect the `robots.txt` file, allowing you to control how and what they access.** A possible configuration for managing these bots is shown below:\n\n```\nUser-agent: googlebot\nUser-agent: bingbot\nAllow: /\nDisallow: /search\n\nUser-agent: *\nAllow: /$\nDisallow: /\n```\n\n_This allows Google and Bing bots to crawl your site, except for search pages, while restricting all other bots to the site’s root._\n\nThis setup grants broad access to valuable, known bots but limits overly aggressive crawling of all your site’s pages. However, notice how the second-highest bandwidth consumer is ClaudeBot—an **AI bot notorious for consuming large amounts of data while disregarding the `robots.txt` file.** In this new AI-driven world, we’re seeing more of these kinds of bots.\n\nAt deco.cx, we offer a standard `robots.txt` similar to the example above for our sites, but for bots that don’t respect this standard, the only way to control access is through blocking them at the CDN (in our case, Cloudflare). At Deco, we use three approaches to block these bots:\n\n- **Block by User-Agent**: Bots that ignore `robots.txt` but have a consistent user-agent can be blocked directly at our CDN.\n\n- **Challenge by ASN**: Some bots, especially malicious ones, come from networks known for such attacks. We place a challenge on these networks, making it difficult for machines to solve.\n\n- **Limit Requests by IP**: After a certain number of requests from a single origin, we present a challenge that users must solve correctly or face a temporary block.\n\nThese rules have effectively controlled most bots…\n\n…except Facebook.\n\n## “Facebook, Are You Okay?”\n\nWe’ve discussed bots that respect `robots.txt`. 
And then there’s Facebook.\n\nJust before Facebook’s new privacy policy went into effect—allowing user data to be used for AI training—we noticed a significant spike in the behavior of Facebook’s bot on our networks. This resulted in a substantial increase in data consumption, as shown in the graph below.\n\n[More details on Facebook’s new privacy policy](https://www.gov.br/anpd/pt-br/assuntos/noticias/anpd-determina-suspensao-cautelar-do-tratamento-de-dados-pessoais-para-treinamento-da-ia-da-meta)\n\n![Traffic June 2024](https://ozksgdmyrqcxcwhnbepg.supabase.co/storage/v1/object/public/assets/530/206499a0-a2b2-4d49-9df0-c8f037a16101)\n\n_Aggregate data traffic for a set of sites in June 2024._\n\nThe Facebook bot typically fetches data when a link is shared on the platform, including details about the image and site information. However, we discovered that the bot wasn’t just fetching this data—it was performing a full crawl of sites, aggressively and without respecting `robots.txt`!\n\nMoreover, Facebook uses various IPv6 addresses, meaning the crawl doesn’t come from a single or a few IPs, making it difficult to block with our existing controls. \nWe didn’t want to block Facebook entirely, as this would disrupt sharing, but we also didn’t want to allow their bots to consume excessive resources. To address this, we implemented more specific control rules, limiting access across Facebook’s entire network…\n\n![Traffic July 2024](https://ozksgdmyrqcxcwhnbepg.supabase.co/storage/v1/object/public/assets/530/9cdc728c-af5e-44dc-802b-6da8550cb207)\n_Aggregate data traffic for a set of sites in July 2024._\n\n…which proved to be highly effective.\n\n## Blocking Too Much Could Hurt Your Presence in Emerging Bots or Technologies\n\nA final word of caution: adopting an overly aggressive approach has its downsides. \nRestricting access to unknown bots might prevent new technologies and tools that could benefit your site from interacting with it. 
For example, a new AI that could recommend specific products to visitors might be inadvertently blocked. \nIt’s crucial to strike a balance, allowing selective bot access in line with market evolution and your business needs.\n\nIn summary, bots and crawlers are valuable allies, but managing their access requires strategic thinking. \nThe key is to allow only beneficial bots to interact with your site while staying alert to new technologies that might emerge. This balanced approach will ensure that your business maximizes return on traffic and resource consumption."}},"date":"08/15/2024","path":"bots","tags":[],"author":"Matheus Gaudêncio"},{"img":"https://ozksgdmyrqcxcwhnbepg.supabase.co/storage/v1/object/public/assets/530/05ed5d60-22c1-459a-8323-53d3b2b3b3d9","body":{"en":{"descr":"How to refactor common design patterns from React into Native Web","title":"Leveraging native web APIs to reduce bundle size and improve performance","content":"\n\nIn modern web development, libraries like React have become the norm for building dynamic and interactive user interfaces. However, relying heavily on such libraries can sometimes lead to larger bundle sizes and reduced performance. By leveraging native web APIs, we can accomplish common design patterns more efficiently, enhancing performance and overall web compatibility. In this article, we will explore how to use these APIs to reduce our dependence on React hooks like `useState`.\n\n### Case Study: The Hidden `<input type=\"checkbox\">` Hack\n\nOne often overlooked technique in native web development is the hidden `<input type=\"checkbox\">` hack. 
This trick, which has been around for a while, offers a way to control UI state without relying on JavaScript, reducing bundle size and potentially improving performance.\n\n#### The React Approach\n\nLet’s start with a typical example in React:\n\n![hello](https://github.com/deco-cx/community/assets/1753396/6baf8c80-e11a-48fd-a611-cdf7405bfec2)\n\n\n```tsx\nimport { useState } from \"preact/hooks\";\n\nexport default function Section() {\n const [display, setDisplay] = useState(false);\n\n return (\n
<div>\n      <button onClick={() => setDisplay(!display)}>Click me</button>\n      {display && <div>Hello!</div>}\n    </div>
\n );\n}\n```\n\nIn this example, the `useState` hook and the `onClick` handler are used to toggle the visibility of a piece of UI. While this approach is effective, it involves additional JavaScript, which can contribute to a larger bundle size.\n\n#### The Native Web API Approach\n\nNow, let’s refactor this example to use only native web APIs:\n\n```tsx\nexport default function Section() {\n return (\n
<div>\n      <input id=\"toggle\" type=\"checkbox\" class=\"hidden peer\" />\n      <label for=\"toggle\" class=\"cursor-pointer\">Click me</label>\n      <div class=\"hidden peer-checked:block\">Hello!</div>\n    </div>
\n  );\n}\n```\n\n#### Explanation\n\nSo, what’s happening here? We’ve replaced the `useState` hook with a hidden `<input type=\"checkbox\">` element. Here’s a breakdown of the changes:\n\n1. **Hidden Checkbox**: We introduce an `<input type=\"checkbox\">` with an `id` and the `hidden` class. This checkbox will control the state.\n2. **Label for Toggle**: The `