[Barrenechea Logo]

Barrenechea Website Repository

Source code for barrenechea.cl, where the website's construction takes place.

Overview

Welcome to the official repository for the Barrenechea website. This site is crafted with modern web technologies to ensure a fast, responsive, and visually appealing experience for all visitors.

Key Features

  • Static Site Generation: Powered by Astro, enhancing performance and SEO.
  • Styling: Leveraging Tailwind CSS for scalable and maintainable design.
  • Typography: Featuring Work Sans for clean and professional text presentation.
  • Internationalization (i18n): Offering multi-language support with automated translation powered by Large Language Models (LLMs).
  • SEO Optimization: Equipped with sitemap, RSS feed, and OpenGraph images to improve search engine visibility.

Getting Started

To contribute to the development of the website or set up a local version, follow the steps below:

Prerequisites

Ensure you have Node.js >=20 installed on your machine.
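The version requirement can be verified before installing anything. A minimal POSIX-shell sketch; the `node_ok` helper is hypothetical, not part of the repository:

```shell
# node_ok: succeeds if the given `node --version` output is v20 or newer
node_ok() {
  major="${1#v}"        # strip the leading "v", e.g. v20.11.1 -> 20.11.1
  major="${major%%.*}"  # keep only the major component, e.g. 20
  [ "$major" -ge 20 ]
}

# Typical usage on your machine (requires Node.js on PATH):
#   node_ok "$(node --version)" && echo "Node.js is new enough"
node_ok "v20.11.1" && echo "v20.11.1 is supported"
node_ok "v18.19.0" || echo "v18.19.0 is too old; please upgrade"
```

The helper only parses the major version, which is all the `>=20` requirement depends on.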

Installation

  1. Clone the repository and navigate to its directory:

    git clone https://github.com/barrenechea/barrenechea-website.git
    cd barrenechea-website
  2. Install project dependencies:

    npm install

Development Server

To start the local development server:

npm start

Visit http://localhost:3000 in your browser to view the site.

Gitpod Development

Alternatively, launch a ready-to-code dev environment with Gitpod:

Open in Gitpod

Internationalization (i18n) Content Automation

To detect and translate content automatically, run:

npm run i18n:generate

Before running the command, set the OPENAI_API_KEY environment variable with your OpenAI API key for translation services.

For custom AI API endpoints, use the OPENAI_BASE_URL environment variable. I'm currently experimenting with the llama-cpp-python server as an alternative.
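Both variables can be exported in the shell before invoking the script. A hedged sketch; the key value is a placeholder, and the endpoint URL is an assumption based on a llama-cpp-python server running locally (your port and path may differ):

```shell
# Required: your OpenAI API key (placeholder value shown).
export OPENAI_API_KEY="sk-your-key-here"

# Optional: point the script at a custom OpenAI-compatible endpoint,
# e.g. a local llama-cpp-python server (URL is an assumption).
export OPENAI_BASE_URL="http://localhost:8000/v1"

# With the variables in place, run:
#   npm run i18n:generate
```

If `OPENAI_BASE_URL` is left unset, the script should fall back to the official OpenAI endpoint.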