
By Alex Bonnetti
213 articles

01. getting-started-with-epycbyte: Get started with Epycbyte

Get Started with Epycbyte

This step-by-step tutorial will help you get started with Epycbyte, an end-to-end platform for developers that lets you create and deploy your web application.

Table of Contents
- Epycbyte Overview
- Before You Begin
- Step 1 – Projects & Deployments
- Step 2 – Add a Domain
- Step 3 – Collaborate
- Next Steps

Epycbyte Overview

Epycbyte is a platform for developers that provides the tools, workflows, and infrastructure you need to build and deploy your web apps faster, without additional configuration. Epycbyte supports popular frontend frameworks out of the box, and its scalable, secure infrastructure is globally distributed to serve content from data centers near your users for optimal speeds.

During development, Epycbyte provides tools for real-time collaboration on your projects, such as automatic preview and production environments and comments on preview deployments.

Before You Begin

To get started, create an account with Epycbyte and select the plan that's right for you.
- Sign Up: If you've never used Epycbyte before, sign up for a new Epycbyte account.
- Log In: If you already have an Epycbyte account, log in to get started.

Once you create an account, you can choose to authenticate either with a Git provider or by using an email. When using email authentication, you may need to confirm both your email address and a phone number.

Customizing Your Journey

This tutorial is framework-agnostic, but Epycbyte supports many frontend frameworks. As you go through the docs, the quickstarts provide specific instructions for your framework. If you don't find what you need, give us feedback and we'll update them!

While many of these instructions use the dashboard, you can also use the Epycbyte CLI to carry out most tasks on Epycbyte. In this tutorial, look for the "Using CLI?" sections for the CLI steps.
Using CLI?

To use the CLI, you'll need to install it (shown here with pnpm; npm and yarn work as well):

```
pnpm i -g epycbyte
```

Last updated on September 27, 2024

Last updated on Aug 05, 2025

02. accounts: Accounts on Epycbyte

Accounts on Epycbyte

Learn how to manage your Epycbyte account and team members effectively.

Table of Contents
1. Getting Started with Account Management
2. Create an Account
3. Manage Emails
4. Create a Team
5. Team Roles & Permissions
6. Plans & Billing
7. Glossary

Getting Started with Account Management

Create an Account: Learn how to register with Epycbyte.
- Step 1: Visit the Epycbyte website.
- Step 2: Click the "Sign Up" or "Register" button.
- Step 3: Fill in your details and complete the registration process.

Manage Emails: Learn how to manage the email addresses in your Epycbyte account.
- Add an email address: Go to the email settings section, click "Add Email," enter the email address, and follow the prompts.
- Remove an email address: Select the email you want to remove and click the delete option.
- Forwarding: Enable or disable forwarding for each email.

Create a Team

Set Up a Team: Learn how to invite members and manage access.
- Step 1: Log in to your Epycbyte account.
- Step 2: Navigate to the team management section.
- Step 3: Click "Create Team" or "Add Member," enter the member's email address, and send an invitation.

Team Roles & Permissions: Assign roles and set permissions for your team members.
- Admin: Full access to manage the account and team.
- Member: Access to specific services, but not administrative tasks.
- Viewer: Limited access to view certain details only.

Plans & Billing
- Hobby Plan: Ideal for personal projects and hobbyists. Includes basic features like email management and essential hosting services.
- Pro Plan: Designed for professional developers and small teams. Offers advanced features, more storage, and better support.
- Enterprise Plan: Tailored for businesses with larger teams or complex needs. Custom solutions and dedicated account management.

Glossary
- Account Management: The process of managing user accounts and team settings on Epycbyte.
- Team Members: Individuals invited to join a team to access shared services and resources.
- Roles & Permissions: Designations assigned to team members determining their access level.
- Billing Process: The system used for charging and managing payment details.

Last updated on Aug 05, 2025

02. accounts: Create a Team

Create a Team

Teams on Epycbyte allow collaboration with members and access to additional resources. Below is a guide to creating, managing, and interacting with teams.

Table of Contents
- Creating a Team
- Team Membership
- Suggested Teams
- Leaving a Team
- Deleting a Team
- Default Team
- Finding Your Team ID
- Team Email Domain

Creating a Team
1. Open the team creator: Click the scope selector at the top left of the navigation bar and choose "Create New Team."
2. Name your team: Provide a name for your team.
3. Choose a plan: Select a team plan based on your existing account settings.

Team Membership

Join a team through:
- An invitation from a team owner
- Automatic addition via an identity provider
- Requesting access by pushing a commit to a private Git repo, or by interacting with suggested teams

Suggested Teams

Epycbyte suggests teams based on your email domain and your GitHub, GitLab, or Bitbucket memberships. These appear in the scope selector and in team settings.

Leaving a Team
- If you're not the last member, go to the team dashboard, select "Settings," then "Leave Team."
- If you're the last member, delete the team instead.

Deleting a Team
1. Remove all domains from the team.
2. Go to the team dashboard, select "Settings," and click "Delete Team."

Default Team

The default team is used in API/CLI requests and is shown on login. The first Hobby or Pro team you create becomes your default.

Changing Your Default Team
1. Navigate to epycbyte.com/account/settings.
2. Select a new default team from the dropdown and save.

Finding Your Team ID
- Use the Epycbyte API.
- Visit https://yourdomain.epycbyte.com/teams/{team-id}.
- Or check your team settings.

Team Email Domain

Add domains to allow members to request access via email. For example, adding "acme.com" allows Acme employees to join.
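As a small illustration of the URL pattern above, the team ID is the path segment after `/teams/`. The helper below is not part of any Epycbyte SDK — just a sketch of extracting that segment:

```javascript
// Hypothetical helper (not an Epycbyte API): extract the {team-id} segment
// from a team settings URL of the form https://<domain>/teams/{team-id}.
function teamIdFromUrl(url) {
  const match = new URL(url).pathname.match(/^\/teams\/([^\/]+)/);
  return match ? match[1] : null;
}

teamIdFromUrl("https://yourdomain.epycbyte.com/teams/team_abc123"); // "team_abc123"
teamIdFromUrl("https://yourdomain.epycbyte.com/account");           // null
```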

Last updated on Aug 05, 2025

02. accounts: Manage Emails

Manage Emails

Getting Started

To access your email settings from the dashboard:
1. Select your avatar in the top right corner of the dashboard.
2. Select "Account Settings" from the list.
3. Select the "Settings" tab and scroll down to the "Emails" section.

You can then add, remove, or change the primary email address associated with your account.

Adding a New Email Address
1. Follow the steps above and select the "Add Another" button in the Emails section of your account settings.
2. Once added, Epycbyte will send an email with a verification link to the newly added address.
3. Follow the link in the email to verify your new email address.

Any verified email — not just your primary one — can be used to log in to your account.

Changing Your Primary Email Address

After adding and verifying a new email address, select "Set as Primary" in its dot menu to change your primary email address.

Removing an Email Address
1. Select the "Delete" button in the email's dot menu.
2. If you're removing the primary email, set a new primary email first.

Last updated on Aug 05, 2025

02. accounts / plans: Epycbyte Enterprise Plan

Epycbyte Enterprise Plan

The Epycbyte Enterprise plan is designed for organizations and enterprises seeking high performance, advanced security, and dedicated support. This tailored plan helps scale your enterprise effectively.

Performance and Reliability
- Isolated build infrastructure on high-grade hardware ensures exceptional performance.
- No build queues, providing a seamless experience.
- Greater function limits, including bundle size, duration, memory, and concurrency.

Security and Compliance
- Trusted Proxy.
- Compliance measures such as Password Protection, Private Production Deployments, and Trusted IPs.
- Dedicated DDoS support, WAF account-level IP Blocking, and Managed Rulesets.
- Epycbyte Firewall, with SSO/SAML Login and Directory Sync.

Conformance and Code Owners
- Conformance ensures high standards in performance, security, and code health.
- Code Owners lets you define responsibility for directories and files in your codebase.
- The DX Platform supports Allowlists, Curated rules, and Custom rules.

Observability and Reporting
- Managed Infrastructure: Audit Logs, Speed Insights, and Custom Events tracking.
- DX Platform: OpenTelemetry support, configurable log drains, and integrations such as Datadog and New Relic.

Scalability and Customizability
- Customizable resources such as MIUs (Managed Instance Updates), build container resources, and concurrent builds.
- Configurable limits for Serverless Functions, including memory up to 3009 MB and duration up to 900 seconds.
- Unlimited domains per project and a custom number of deployments per day.

Administration and Support
- Streamlined team collaboration with DX Platform Edit Mode, Role-Based Access Control (RBAC), and Access Groups.
- Dedicated Success Manager with SLAs, including response-time guarantees.
- Epycbyte Support Center and professional services for Next.js audits and updates.

This article provides an overview of the Epycbyte Enterprise Plan. For detailed information, visit the official documentation or contact Epycbyte sales to explore how this plan can benefit your organization.

Last updated on Aug 05, 2025

02. accounts / plans: Epycbyte Hobby Plan

Epycbyte Hobby Plan Overview

Introduction

The Epycbyte Hobby Plan is a free-tier offering designed for non-commercial, personal use. It provides access to a range of features, though with certain usage limits. Below, we outline the key features, pricing, and limitations of the Hobby Plan.

Table of Contents
1. Managed Infrastructure Features
2. DX Platform Features
3. General Features
4. Downgrading to Hobby
5. Hobby Billing Cycle

Managed Infrastructure Features

The Hobby Plan includes access to Managed Infrastructure and DX Platform features, with usage limits compared to the Pro Plan.

| Feature | Hobby Included | Pro Included |
| --- | --- | --- |
| Data Cache Reads | First 1,000,000 | First 10,000,000 |
| Regional Data Cache Writes | First 200,000 | First 2,000,000 |
| Edge Config Reads | First 100,000 | First 1,000,000 |
| Edge Config Writes | $3.00 - 1,000,000 | $5.00 - 500 |
| Edge Function Execution Units | First 500,000 | First 1,000,000 |
| Edge Middleware Invocations | First 1,000,000 | First 1,000,000 |
| Edge Request Additional CPU Duration | N/A | 1 hour |
| Regional Edge Requests | First 1,000,000 | First 10,000,000 |
| Regional Fast Data Transfer | First 100 GB | First 1 TB |
| Regional Fast Origin Transfer | First 10 GB | First 100 GB |
| Serverless Function maximum duration | 10s (default), configurable up to 60s | 15s (default), configurable up to 300s |

Hobby vs. Pro Resources

| Feature | Hobby Limitations | Pro Benefits |
| --- | --- | --- |
| Data Cache Reads | 1,000,000 | 10,000,000 included |
| Data Cache Writes | 200,000 | 2,000,000 included |
| Edge Config Reads | 10,000,000 | 100,000,000 included |
| Edge Config Writes | 1,000,000 | 1,000,000,000 included |
| Serverless Functions | 6,000 minutes | 24,000 minutes |

DX Platform Features

| Feature | Hobby Included | Pro Included |
| --- | --- | --- |
| ISR Reads | Up to 1,000,000 | 10,000,000 included |
| ISR Writes | Up to 200,000 | 2,000,000 included |
| Edge Requests | Up to 1,000,000 | 10,000,000 included |

General Features

| Feature | Hobby Included | Pro Included |
| --- | --- | --- |
| Projects | 200 | Unlimited |
| Deployments per day | 100 | 6,000 |
| Team collaboration features | Yes | Yes |
| Domains per project | 50 | Unlimited |
| Build execution minutes | 6,000 | 24,000 |

The Hobby Plan supports key DX Platform features, though with reduced capabilities compared to the Pro Plan.

Supported Features
- Projects: Up to 200 projects
- Domains: 50 domains per project
- Deployments: 100 deployments per day
- Analytics: Limited to non-commercial use
- Email Support: Available for team collaboration
- Epycbyte AI Playground: Access to models such as Llama, GPT-3.5, Mixtral, Claude, and Mistral Large

Team Collaboration
- Team Members: Owner, Member, or Billing roles available
- RBAC (Role-Based Access Control): Not available on the Hobby Plan

Storage and Logging
- Storage: KV, Postgres, and Blob (Beta)
- Activity Logs: 1 hour of logs and up to 4,000 rows
- Runtime Logs: 1 day of logs and up to 100,000 rows

Downgrading to Hobby

If you downgrade from the Pro Plan to Hobby:
- Team Limitation: Each account is limited to one team on the Hobby plan.
- Downgrade Process:
  1. Navigate to your dashboard and select your team.
  2. Go to Settings > Billing and click "Downgrade Plan."
  3. Note: Downgrading may require transferring connected Stores or Domains.

Hobby Billing Cycle

The Hobby plan is a free tier with no billing cycles. Exceeding usage limits typically requires waiting up to 30 days before access is restored, though some features (like Web Analytics) may have shorter periods.

Fair Use Guidelines
- Non-Commercial Use: The Hobby Plan is intended for personal use only.
- Usage Log Reset: When your personal account converts to a Hobby team, your usage and activity log will be reset.

DDoS Mitigation and WAF
- DDoS Protection: On by default, with optional Attack Challenge Mode.
- WAF (Web Application Firewall):
  - IP Blocking: Up to 10 IPs
  - Custom Rules: Up to 3
  - Deployment Protection: Available

Conclusion

The Epycbyte Hobby Plan offers a cost-effective solution for personal and non-commercial projects. While it includes many Pro-level features with usage limits, it provides a great starting point for users exploring Epycbyte's platform. To explore further, you can start a free trial and experience unlimited Serverless Function requests, 1,000 GB-hours of execution, and boosted application bandwidth during your trial period.

For more details, refer to the official Epycbyte documentation or contact customer support.

Last updated on Aug 05, 2025

02. accounts: Spend Management

Spend Management Overview

Spend management is a tool for optimizing resource usage and controlling costs on the platform. This guide explains how to set spend limits, manage notifications, pause projects, and integrate webhooks for automated actions.

Setting Up Spend Management
1. Enable Spend Management: Spend management is available on Epycbyte Pro plans. Ensure your account is upgraded to the appropriate tier to access these features.
2. Define a Budget Amount: Set a budget amount under the "Spend Management" section of your team settings. This limit is enforced during the billing cycle.
3. Notifications and Alerts: Configure alerts for when your spend reaches 100% of the budget or when the billing cycle ends. You can enable notifications via email, Slack, or webhooks.

Managing Spend Thresholds
- Current Spend: The total cost incurred during the current billing cycle. Tracking it helps you understand usage patterns and plan future budgets.
- Budget Amount: The maximum spend allowed for your team during a billing period. Exceeding this limit triggers alerts and requires manual intervention to resume paused projects.

Pausing Projects
- Pause: When spend approaches or exceeds the budget, you can manually pause all projects associated with that spend. This prevents further costs until the next billing cycle.
- Resume: After adjusting budgets or at the end of a billing cycle, resume projects by removing them from the paused state.

Note: Paused projects cannot be resumed automatically unless explicitly allowed via webhook configurations.

Configuring Webhooks

Set up a webhook URL to receive POST requests when specific events occur:
- Spend reaches 100% of the budget.
- The billing cycle ends (useful for resuming paused projects).

Webhook payload example:

```json
{
  "budgetAmount": 500,
  "currentSpend": 500,
  "teamId": "team_jkT8yZ3oE1u6xLo8h6dxfNc3"
}
```

Event types:
- endOfBillingCycle: Triggered at the end of a billing period, allowing you to resume paused projects.
- spendReachedBudget: Triggered when spend reaches or exceeds the budget amount.

Activity Tracking

Epycbyte provides detailed activity logs in the dashboard, including:
- Spend adjustments and updates
- Project pausing and unpausing actions
- Webhook triggers and responses

Additional Resources

For more information on pricing, usage optimization, and invoices:
- Conceptual: How are resources used on Epycbyte?
- How-to: Manage and optimize usage
- Understanding my invoice
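A webhook receiver might dispatch on the two documented event types. The sketch below is illustrative only — the payload fields come from the example above, but how the event name is delivered alongside the payload is an assumption, and the function/action names are mine:

```javascript
// Sketch of a spend-management webhook dispatcher (not an official Epycbyte
// API). `eventType` is assumed to arrive with the payload; the payload shape
// follows the documented example ({ budgetAmount, currentSpend, teamId }).
function handleSpendEvent(eventType, payload) {
  if (eventType === "spendReachedBudget") {
    // Spend hit 100% of the budget: pause this team's projects.
    return { action: "pauseProjects", teamId: payload.teamId };
  }
  if (eventType === "endOfBillingCycle") {
    // A new billing cycle started: safe to resume paused projects.
    return { action: "resumeProjects", teamId: payload.teamId };
  }
  return { action: "ignore", teamId: payload.teamId };
}
```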

Last updated on Aug 05, 2025

02. accounts / team-members-and-roles: Access Groups

Access Groups

Learn how to configure access groups for team members on an Epycbyte account.

Access Groups are available on Enterprise plans. Those with the owner role can access this feature.

Access Groups provide a way to manage groups of Epycbyte users across the projects on your team. An Access Group is a set of project role assignments — a combination of Epycbyte users and the projects they work on. It consists of one or more projects in a team and assigns project roles to team members. Any team member included in an Access Group is assigned the projects in that Access Group, along with a default role.

Team administrators can apply automatic role assignments for default roles, and for more restricted projects you can ensure that only a subset of users has access. This is handled with project-level role-based access control (RBAC).

Example access group relationship diagram

Create an Access Group
1. Navigate to your team's Settings tab, then Access Groups (/~/settings/access-groups).
2. Select Create Access Group.
3. Create a name for your Access Group.
4. Select the projects and project roles to assign.
5. Select the Members tab.
6. Add members with the Developer or Contributor role to the Access Group.
7. Create your Access Group by pressing Create.

Edit the projects of an Access Group
1. Navigate to your team's Settings tab, then Access Groups (/~/settings/access-groups).
2. Press the Edit Access Group button for the Access Group you wish to edit.
3. Either:
   - Remove a project using the remove button to the right of the project, or
   - Add more projects using the Add more button below the project list and the selection controls.

Add and remove members of an Access Group
1. Navigate to your team's Settings tab, then Access Groups (/~/settings/access-groups).
2. Press the Edit Access Group button for the Access Group you wish to edit.
3. Select the Members tab.
4. Either:
   - Remove a member using the remove button to the right of the member, or
   - Add more members using the Add more button and the search controls.

Modifying Access Groups for a single team member

You can do this in two ways:
- From your team's Members page, using the Manage Access button (recommended for convenience). Access this by navigating to your team's Settings tab, then Members.
- By editing each Access Group with the Edit Access Group button and changing its Members list.

Access Group behavior

When configuring Access Groups, there are some key things to be aware of:
- Team roles cannot be overridden; an Access Group manages project roles only.
- Only a subset of team role and project role combinations is valid:
  - Owner, Member, Billing, Viewer: all project role assignments are ignored.
  - Developer: an Admin assignment is valid on selected projects; Project Developer and Project Viewer assignments are ignored.
  - Contributor: Admin, Project Developer, or Project Viewer roles are valid on selected projects.
- When a Contributor belongs to multiple access groups, the computed role for a project is:
  - Admin, if any of their access groups maps that project to Admin;
  - Project Developer, if any of their access groups maps the project to Project Developer and none maps it to Admin;
  - Project Viewer, if any of their access groups maps the project to Project Viewer and none maps it to Admin or Project Developer.
- When a Developer belongs to multiple access groups, the computed role is:
  - Admin, if any of their access groups maps the project to Admin;
  - Project Developer, if any of their access groups maps the project to Project Developer and none maps it to Admin.

Directory sync

If you use Directory Sync, you can map a Directory Group to an Access Group. This grants all users in the Directory Group access to the projects assigned in the Access Group. Some things to note:
- The final role a user has in a specific project depends on the mappings of all Access Groups the user belongs to.
- Assignments made via Directory Sync can lead to Owners, Members, Billing members, and Viewers being part of an Access Group; in this scenario, the Access Group assignments are ignored.
- When a Directory Group is mapped to an Access Group, members of that group default to the Contributor role at the team level, unless another Directory Group assignment overrides the team role.
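The multiple-access-group rules above reduce to a precedence check (Admin > Project Developer > Project Viewer). The function below is an illustrative model of those rules, not Epycbyte code; `mappings` is the list of project roles a member's access groups assign for one project:

```javascript
// Illustrative model of the computed-role rules (not Epycbyte's code).
// Returns the effective project role, or null when no project role applies.
function computedProjectRole(teamRole, mappings) {
  // Owner, Member, Billing, Viewer: project role assignments are ignored.
  if (["Owner", "Member", "Billing", "Viewer"].includes(teamRole)) return null;
  if (mappings.includes("Admin")) return "Admin";
  if (mappings.includes("Project Developer")) return "Project Developer";
  // Project Viewer only applies to Contributors (ignored for Developers).
  if (teamRole === "Contributor" && mappings.includes("Project Viewer")) {
    return "Project Viewer";
  }
  return null;
}
```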

Last updated on Aug 05, 2025

02. accounts / team-members-and-roles: Team Roles & Permissions

Team Roles & Permissions

Team-Level Roles

Owner
- Key responsibilities: manage team settings and configurations; oversee team members and their roles; make financial decisions for the team.
- Access and permissions: full control over the team, including billing information and payment methods.

Member
- Key responsibilities: participate in team discussions and activities; contribute to projects as assigned by the owner or other members; manage their own tasks and assignments.
- Access and permissions: limited access to team settings and configurations, but can view project details and collaborate with other members.

Developer
- Key responsibilities: develop and deploy code for projects; manage project environments and settings; collaborate with other developers on project development.
- Access and permissions: full control over project environments and settings, as well as access to project billing information.

Contributor
- Key responsibilities: contribute to specific projects as assigned by the owner or other members; manage their own tasks and assignments within those projects; collaborate with other contributors on project development.
- Access and permissions: limited access to team settings and configurations, but can view project details and collaborate with other contributors.

Billing
- Key responsibilities: oversee and manage the team's billing information; review and manage team and project costs; handle the team's payment methods.
- Access and permissions: full control over team billing information and payment methods, as well as read-only access to all projects within the team.

Viewer
- Key responsibilities: monitor and inspect all team projects; review shared team resources; observe team settings and configurations.
- Access and permissions: broad viewing privileges, but restricted from making changes.

Project-Level Roles

Project Administrator
- Key responsibilities: govern project settings; deploy to all environments; manage all environment variables and oversee domains.
- Access and permissions: significant authority at the project level, but restricted to the projects they're assigned to.

Project Developer
- Key responsibilities: initiate deployments; manage environment variables for development and preview environments; handle project domains.
- Access and permissions: full control over project environments and settings, as well as access to project billing information.

Project Viewer
- Key responsibilities: view and inspect all deployments; review project settings; examine environment variables across all environments and view project domains.
- Access and permissions: broad view, but restricted from making changes.
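A few of the team-level descriptions above can be distilled into simple predicates. These are illustrative only, based solely on the descriptions in this article — the dashboard is the authoritative source for permissions:

```javascript
// Predicates distilled from the role descriptions above (illustrative only).

// Only Owners have "full control over the team" settings.
function canManageTeamSettings(teamRole) {
  return teamRole === "Owner";
}

// Owners have full control; the Billing role manages billing and payments.
function canManageBilling(teamRole) {
  return teamRole === "Owner" || teamRole === "Billing";
}

// Viewers are restricted from making changes, and Billing has read-only
// access to projects; the other roles can modify the projects they work on.
function canModifyProjects(teamRole) {
  return !["Viewer", "Billing"].includes(teamRole);
}
```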

Last updated on Aug 05, 2025

03. epycbyte-platform: Glossary

Glossary

This glossary provides clear definitions for the terms and concepts used in Epycbyte's products and documentation.

Table of Contents
- Directory
- Repository
- Monorepo
- Multi-repo
- Workspace
- Single-package workspace
- Multi-package workspace
- Package

Directory

A directory, also known as a folder in some operating systems, is a file system structure used to organize and store files. It helps manage files by grouping them into a hierarchical structure of directories and subdirectories. In programming, "directory" is often abbreviated as "dir."

Repository

In version control systems like Git, a repository is a location where files, including source code, are stored and managed. It contains the current version of every file and a history of all changes, which is essential for tracking modifications and collaboration.

Monorepo

A monorepo (monolithic repository) stores multiple packages or modules in a single repository. This approach contrasts with multi-repos, where each package has its own repository. Monorepos facilitate easier code sharing and collaboration across different parts of a codebase.

Multi-repo

A multi-repo (multi-repository) setup, also called a polyrepo, is a version control strategy where each package or module has its own separate repository. This stands in contrast to monorepos, which store multiple packages in one repository.

Workspace

In JavaScript, a workspace is an entity within a repository that can be a single package or a collection of packages. The root lockfile (e.g., pnpm-lock.yaml) is located at the workspace root. Workspaces are often at the repository's root, but this is not required: some monorepos have multiple workspaces in subdirectories.

Single-package workspace

A workspace representing a standalone package, with a single package.json file at its root. It does not contain multiple packages.

Multi-package workspace

A workspace containing multiple packages. It has multiple package.json files, including one at the root for global configuration. For pnpm, the workspace configuration lives in pnpm-workspace.yaml; npm and Yarn use the "workspaces" key in package.json. This type of workspace is often associated with monorepos.

Package

A package is a collection of files and directories grouped by purpose. Types include libraries, applications, services, and tools. Packages enable modular codebases and are managed through package managers like npm, following semantic versioning.
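For a multi-package workspace as described above, a minimal root package.json for npm or Yarn might look like this (the package names and directory layout are illustrative):

```json
{
  "name": "my-workspace-root",
  "private": true,
  "workspaces": ["packages/*", "apps/*"]
}
```

The pnpm equivalent lives in pnpm-workspace.yaml, with the same globs under a `packages:` list.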

Last updated on Aug 05, 2025

03. epycbyte-platform: Private Registry

Setting up Epycbyte's Private Registry

Local Environment Setup

Configuring npm:

```
npm config set //epycbyte-private-registry.epycbyte.sh/:_authToken $EPYCBYTE_TOKEN
```

Configuring yarn:

```
yarn config set --global //epycbyte-private-registry.epycbyte.sh/:_authToken $EPYCBYTE_TOKEN
```

Configuring pnpm:

```
pnpm config set --global //epycbyte-private-registry.epycbyte.sh/:_authToken $EPYCBYTE_TOKEN
```

Preinstall Script

```js
// preinstall.mjs
throw new Error(
  `Please log in to the Epycbyte private registry to install \`@epycbyte-private\`-scoped packages:\n\`npm login --scope=@epycbyte-private\``
);
```

package.json script:

```json
{
  "scripts": {
    "pnpm:devPreinstall": "node preinstall.mjs"
  }
}
```

Epycbyte Configuration

Environment variables:

```
export EPYCBYTE_TOKEN=your_token_here
export NPM_RC=@epycbyte-private:registry=https://epycbyte-private-registry.epycbyte.sh/:_authToken=${EPYCBYTE_TOKEN}
```

CI Setup with GitHub Actions

```yaml
# .github/workflows/conformance.yml
name: Conformance
on:
  pull_request:
    branches: [main]
jobs:
  conformance:
    name: 'Run Conformance'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v4
        with:
          node-version-file: '.node-version'
      - uses: pnpm/action-setup@v3
      - run: npm config set //epycbyte-private-registry.epycbyte.sh/:_authToken $EPYCBYTE_TOKEN
        env:
          EPYCBYTE_TOKEN: ${{ secrets.EPYCBYTE_TOKEN }}
      - run: pnpm install
      - run: pnpm conformance
        env:
          EPYCBYTE_TOKEN: ${{ secrets.EPYCBYTE_TOKEN }}
```

Notes
- Be sure to create a new token for CI configurations.
- Add secrets in GitHub Actions for secure handling.

Last updated on Aug 05, 2025

03. epycbyte-platform: Glossary

Glossary This glossary provides clear definitions for the terms and concepts used in Epycbyte's products and documentation. Table of Contents - Directory - Repository - Monorepo - Multi-repo - Workspace - Single-package workspace - Multi-package workspace - Package Directory A directory, also known as a folder in some operating systems, is a file system structure used to organize and store files. It helps manage files by grouping them into a hierarchical structure of directories and subdirectories. In programming, "directory" is often abbreviated as "dir." Repository In version control systems like Git, a repository is a location where files, including source code, are stored and managed. It contains the current version of every file and a history of all changes, which is essential for tracking modifications and collaboration. Monorepo A monorepo (monolithic repository) stores multiple packages or modules in a single repository. This approach contrasts with multi-repos, where each package has its own repository. Monorepos facilitate easier code sharing and collaboration across different parts of a codebase. Multi-repo A multi-repo (multi-repository) or polyrepo is a version control strategy where each package or module has its own separate repository. This approach stands in contrast to monorepos, which store multiple packages in one repository. Workspace In JavaScript, a workspace refers to an entity within a repository that can be a single package or a collection of packages. The root lockfile (e.g., pnpm-lock.yaml) is located at the workspace root. Workspaces are often at the repository's root but not required, as some monorepos may have multiple workspaces in subdirectories. Single-package workspace A workspace representing a standalone package with a single package.json file at its root. It does not contain multiple packages. Multi-package workspace A workspace containing multiple packages. 
It has multiple package.json files, including one at the root for global configuration. For pnpm, this is in pnpm-workspace.yaml; npm and Yarn use the "workspaces" key in package.json. This type of workspace is often associated with monorepos. Package A package is a collection of files and directories grouped by purpose. Types include libraries, applications, services, and tools. Packages enable modular codebases and are managed through package managers like npm, following semantic versioning. Was this helpful? Send feedback if you need further clarification.

Last updated on Aug 05, 2025

03. epycbyte-platform: private registry

Setting up Epycbyte's Private Registry

Local Environment Setup

Configuring npm:

```bash
npm config set //epycbyte-private-registry.epycbyte.sh/:_authToken $EPYCBYTE_TOKEN
```

Configuring yarn:

```bash
yarn config set --global //epycbyte-private-registry.epycbyte.sh/:_authToken $EPYCBYTE_TOKEN
```

Configuring pnpm:

```bash
pnpm config set --global //epycbyte-private-registry.epycbyte.sh/:_authToken $EPYCBYTE_TOKEN
```

Preinstall Script

```javascript
// preinstall.mjs
// Fail the install early, with a helpful message, when the registry
// token has not been configured.
if (!process.env.EPYCBYTE_TOKEN) {
  throw new Error(
    'Please log in to the Epycbyte private registry to install `@epycbyte-private`-scoped packages:\n`npm login --scope=@epycbyte-private`'
  );
}
```

Package.json Script

```json
{
  "scripts": {
    "pnpm:devPreinstall": "node preinstall.mjs"
  }
}
```

Epycbyte Configuration

Environment Variables

```bash
export EPYCBYTE_TOKEN=your_token_here
```

The NPM_RC variable should contain the contents of an .npmrc file that points the scope at the private registry and supplies the token:

```
@epycbyte-private:registry=https://epycbyte-private-registry.epycbyte.sh/
//epycbyte-private-registry.epycbyte.sh/:_authToken=${EPYCBYTE_TOKEN}
```

CI Setup with GitHub Actions

```yaml
# .github/workflows/conformance.yml
name: Conformance

on:
  pull_request:
    branches: [main]

jobs:
  conformance:
    name: 'Run Conformance'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v4
        with:
          node-version-file: '.node-version'
      - uses: pnpm/action-setup@v3
      - run: npm config set //epycbyte-private-registry.epycbyte.sh/:_authToken $EPYCBYTE_TOKEN
        env:
          EPYCBYTE_TOKEN: ${{ secrets.EPYCBYTE_TOKEN }}
      - run: pnpm install
      - run: pnpm conformance
        env:
          EPYCBYTE_TOKEN: ${{ secrets.EPYCBYTE_TOKEN }}
```

## Notes

- Create a new token specifically for CI configurations.
- Store the token as a GitHub Actions secret for secure handling.


04. infrastructure: Epycbyte Data Cache

Epycbyte Data Cache

Epycbyte Data Cache is a specialized cache that stores responses from data fetches. Learn more about how it works with Next.js.

Data Cache is available in Beta on all plans.

The Epycbyte Data Cache is a specialized, granular cache for storing responses from fetches while using frontend frameworks like Next.js. Frameworks that integrate with the Data Cache (currently Next.js) are able to cache data per fetch instead of per route. This means you can have static, dynamic, and revalidated data together in the same route. With Epycbyte, you write application code, like component-level data fetching with fetch, and we scaffold globally distributed infrastructure for you with no additional effort. See our examples to learn how to implement this.

Features

- Ephemeral, globally available, regional cache: Every region in which your serverless or edge function runs has an independent cache, so any data used in server-side rendering or Next.js route handlers is cached close to where the function executes.
- Time-based revalidation: All cached data can define a revalidation interval, after which the data will be marked as stale, triggering a re-fetch from origin.
- On-demand revalidation: Any data can be triggered for revalidation on-demand, regardless of the revalidation interval. The revalidation propagates to all regions within 300ms.
- Tag-based revalidation: Next.js allows associating tags with data, which can be used to revalidate all data with the same tag at once with revalidateTag. For example, you could use this to revalidate all responses from a CMS with the same author ID tag.
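The time-based revalidation behavior described above can be sketched in plain JavaScript. This is an illustrative toy, not the platform's implementation: the cache is a simple in-memory Map, and `fetchFromOrigin` stands in for a real data fetch.

```javascript
// Toy sketch of time-based revalidation (illustrative only — not the
// actual Epycbyte Data Cache implementation).
const cache = new Map();

function cachedFetch(key, fetchFromOrigin, revalidateMs, now = Date.now()) {
  const entry = cache.get(key);
  if (entry && now - entry.storedAt < revalidateMs) {
    return { value: entry.value, status: 'HIT' }; // still fresh: serve cached
  }
  const value = fetchFromOrigin(); // missing or stale: re-fetch from origin
  cache.set(key, { value, storedAt: now });
  return { value, status: entry ? 'STALE' : 'MISS' };
}
```

In Next.js itself, the equivalent is declaring the interval on the fetch call (e.g. `fetch(url, { next: { revalidate: 3600 } })` per the Next.js docs) and letting the framework manage the cache.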


04. infrastructure: Infrastructure

Infrastructure

Epycbyte's infrastructure scales as needed, depending on the demand from your users. Using your preferred framework, you can take advantage of Epycbyte's framework-defined infrastructure, which is optimized to your needs.

- Edge Network: Epycbyte's Edge Network enables you to store content and run compute in regions close to your customers or your data, reducing latency and improving end-user performance.
- Epycbyte Functions: Epycbyte Functions enable running code on-demand without needing to manage your own infrastructure, provision servers, or upgrade hardware.
- Edge Middleware
- Image Optimization: Epycbyte offers built-in image optimization capabilities that can help you serve high-quality images with minimal impact on page load times.
- Incremental Static Regeneration: Learn how Epycbyte's Incremental Static Regeneration (ISR) provides better performance and faster builds.
- Edge Cache: Learn about caching on Epycbyte's Edge Network.
- Data Cache: Learn how to use Data Caching to cache static data.
- Cron Jobs: Available on all plans. Cron jobs are time-based scheduling tools used to automate repetitive tasks. When a cron job is triggered through the cron expression, it calls an Epycbyte Function.

Last updated on July 23, 2024


05. platform: Get started with Epycbyte

Get started with Epycbyte

This step-by-step tutorial will help you get started with Epycbyte, an end-to-end platform for developers that allows you to create and deploy your web application.

Epycbyte is a platform for developers that provides the tools, workflows, and infrastructure you need to build and deploy your web apps faster, without the need for additional configuration. Epycbyte supports popular frontend frameworks out-of-the-box, and its scalable, secure infrastructure is globally distributed to serve content from data centers near your users for optimal speeds. During development, Epycbyte provides tools for real-time collaboration on your projects, such as automatic preview and production environments, and comments on preview deployments.

Before you begin

To get started, create an account with Epycbyte. You can select the plan that's right for you.

- If you've never used Epycbyte before, sign up for a new Epycbyte account.
- If you already have an Epycbyte account, log in to get started.

Customizing your journey

This tutorial is framework agnostic, but Epycbyte supports many frontend frameworks. As you go through the docs, the quickstarts will provide specific instructions for your framework. If you don't find what you need, give us feedback and we'll update them!

While many of our instructions use the dashboard, you can also use Epycbyte CLI to carry out most tasks on Epycbyte. In this tutorial, look for the "Using CLI?" section for the CLI steps. To use the CLI, you'll need to install it:

```bash
pnpm i -g epycbyte
```

Last updated on September 27, 2024


06. release-phases: Release Phases for Epycbyte

The Epycbyte product release cycle consists of several phases designed to ensure quality and readiness for General Availability (GA). Below is an overview of the key phases: - Alpha: The initial phase, where essential features are developed but not yet ready for broader use. Products in Alpha are still under active development. - Beta: A phase where products have been announced and are tested by a limited audience to gather feedback and identify issues. Beta versions may lack full support and are not covered by an SLA. - Private Beta: Similar to Beta, but with access restricted to specific customers, who may sign NDAs. Products remain under active development. - Limited Beta: Announced publicly but available only to a select group of users. This phase is often used to control adoption due to capacity constraints or known issues. - Public Beta: The final phase before GA, where the product is available to a wider audience for load testing and onboarding. At least 10 customers must be onboarded, all bugs resolved, and security analysis completed. - General Availability (GA): The product is fully tested, supported, and ready for use by the community with guaranteed uptime. - Deprecated: A phase where products or features are being phased out. Documentation may include guidance for users to migrate away from deprecated features. - Sunset: The final stage, where no customers should be using the product. All related artifacts, including code and documentation, are removed.


07. pricing: Manage and optimize usage for Remote Cache Artifacts

Manage and optimize usage for Remote Cache Artifacts

Learn how to understand the different charts in the Epycbyte dashboard, how usage relates to billing, and how to optimize your usage of artifacts.

The Artifacts section shows the following charts:

| Metric | Description | Chargeable |
| --- | --- | --- |
| Number of Remote Cache Artifacts | The number of uploaded and downloaded artifacts using the Remote Cache API | No |
| Total Size of Remote Cache Artifacts | The size of uploaded and downloaded artifacts using the Remote Cache API | Yes |
| Time Saved | The time saved by using artifacts cached on the Epycbyte Remote Cache | No |

Artifacts are blobs of data or files that are uploaded and downloaded using the Epycbyte Remote Cache API, including calls made using Turborepo and the Remote Cache SDK. Once uploaded, artifacts can be downloaded during the build by any team member. Uploaded artifacts on Epycbyte automatically expire after 7 days.

Artifacts are annotated with a task duration, which is the time required for the task to run and generate the artifact. The time saved is the sum of that task duration for each artifact, multiplied by the number of times that artifact is reused from a cache:

- Remote Cache: The time saved by using artifacts cached on the Epycbyte Remote Cache API.
- Local Cache: The time saved by using artifacts cached on your local filesystem cache.
Number of Remote Cache Artifacts

When your team enables Epycbyte Remote Cache, Epycbyte will automatically cache Turborepo outputs (such as files and logs) and create cache artifacts from your builds. This can help speed up your builds by reusing artifacts from previous builds. To learn more about what is cached, see the Turborepo docs on caching. For other monorepo implementations like Nx, you need to manually configure your project using the Remote Cache SDK after you have enabled Epycbyte Remote Cache. You are not charged based on the number of artifacts, but rather the size in GB downloaded.

Managing total size of Remote Cache artifacts

The total size graph shows the ratio of artifacts uploaded and downloaded from the Epycbyte Remote Cache. Multiple uploads or downloads of the same artifact count as distinct events when calculating these sizes. You are charged based on the total size of artifacts downloaded from the Remote Cache API. Each plan has a limit for the total size of artifacts that can be downloaded from the Remote Cache; anything beyond this limit is charged per additional GB of downloaded artifacts.

Optimizing total size of Remote Cache artifacts

You can reduce artifact size by ensuring that you only cache the files that need to be restored for cache hits. For example, the .next folder contains your build artifacts, but you can avoid caching the .next/cache folder, since it is only used for development and will not speed up your production builds.

Last updated on July 24, 2024
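Following the .next/cache advice above, one way to keep artifact sizes down is to exclude that folder from a task's cached outputs. A sketch using Turborepo's `outputs` globs in `turbo.json` (the file contents are illustrative; older Turborepo versions use a `pipeline` key instead of `tasks`):

```json
{
  "$schema": "https://turbo.build/schema.json",
  "tasks": {
    "build": {
      "outputs": [".next/**", "!.next/cache/**"]
    }
  }
}
```

The `!` prefix negates a glob, so the build output is cached while the development-only `.next/cache` folder is left out of the artifact.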


07. pricing: builds

Manage and optimize usage for Builds

Learn how to understand the different charts in the Epycbyte dashboard, how usage relates to billing, and how to optimize your usage for Builds. The Builds section shows the following charts: Build Time and Number of Builds.

Managing Build Time

The Build Time graph shows the ratio of build time vs. queued time for all projects across your team on any single day. Viewing by Projects provides you with a view of the total combined build time and queued time for each project that your team owns.

Number of Builds

This chart shows the total number of builds that were triggered for all of the projects on your team, split by a ratio of Completed or Errored. Viewing by Projects provides you with a view of the total number of builds for each project.

Optimizing Builds

While neither of these metrics is directly chargeable, you can pay for additional concurrent builds if you need to run more than one build at a time. Other considerations when optimizing your builds include:

- Understand and manage the build cache: by default, Epycbyte caches the dependencies of your project, based on your framework, to speed up the build process.
- Ignore the build step on redeployments: if you know that the build step is not necessary under certain conditions, you may choose to skip it.
- Use the most recent version of your runtime, particularly for Node.js, to take advantage of the latest performance improvements.
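The "ignore the build step" idea can be expressed as a small predicate over the files changed since the last deployment. This is a hypothetical sketch, not an official Epycbyte API — the prefix list and the way the platform consumes the result are assumptions:

```javascript
// Decide whether a redeployment needs a fresh build, based on which
// files changed. Hypothetical sketch — not an official Epycbyte API.
function shouldBuild(changedFiles, watchedPrefixes) {
  // Build only if at least one changed file lives under a watched path.
  return changedFiles.some((file) =>
    watchedPrefixes.some((prefix) => file.startsWith(prefix))
  );
}
```

For example, a docs-only commit would skip the build, while a change under the app source would trigger one.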


07. pricing: Manage and optimize usage for Data Cache

Manage and Optimize Usage for Data Cache

The Epycbyte Data Cache, introduced with the App Router, is currently in public beta and not yet charged. This guide helps you understand the charts available in the Epycbyte dashboard, how usage relates to billing, and how to optimize your Data Cache usage.

Understanding Data Cache Usage

Reads and writes are measured in 8 KB units:

- Read unit: 8 KB of data read from the cache.
- Write unit: 8 KB of data written to the cache.
- Billing: charged based on the data read and written, plus the regions where these actions occur.

Optimizing Data Cache Reads and Writes

1. Time-based revalidation: for content that rarely changes, set a longer revalidation interval. Longer intervals reduce both reads and writes.
2. On-demand revalidation: trigger data updates from specific events to avoid unnecessary reads and writes, giving you manual control over revalidation.

Data Cache Charts Explained

1. Overview Chart
   - Hits: percentage of fetch requests that hit the cache.
   - Misses: percentage of fetch requests that miss the cache.
   - Requests: number of unique path requests.
   - Bandwidth: data transferred from each unique path.
2. Reads Chart
   - Origin: read units from Epycbyte Data Cache or ISR projects.
   - Region: read units per region.
   - Projects: total read units across projects.
3. Writes Chart
   - Origin: write units to Epycbyte Data Cache or ISR projects.
   - Region: write units per region.
   - Projects: total write units across projects.
4. Bandwidth Chart
   - Projects: data transferred and written by each project.
   - Ratio: percentage of total data transferred.
5. Revalidation Chart
   - Projects: number of revalidation requests per project.
   - Ratio: total revalidation requests.

Conclusion

Optimizing Data Cache usage through time-based and on-demand revalidation can reduce costs and improve performance. Regularly review your Epycbyte dashboard to monitor usage and apply these strategies effectively.
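The 8 KB unit accounting described in this section can be made concrete with a short helper. Note one assumption: the docs state only the unit size, so rounding partial units up is an assumption here (it is the typical behavior for this kind of metering):

```javascript
// Data Cache reads and writes are metered in 8 KB units.
// Assumption: a partial unit is rounded up to a whole unit.
const UNIT_BYTES = 8 * 1024;

function billedUnits(payloadBytes) {
  if (payloadBytes <= 0) return 0;
  return Math.ceil(payloadBytes / UNIT_BYTES);
}
```

So a 100-byte cached response and an 8 KB one cost the same single unit, which is why batching many tiny cache entries into fewer, larger ones can reduce billed units.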


07. pricing: Manage and optimize usage for Edge Functions

Manage and Optimize Usage for Edge Functions Overview Edge Functions are a powerful tool for delivering fast and efficient applications. However, understanding their usage and optimizing them effectively is crucial for both performance and cost management. Table of Contents 1. Plan Usage 2. Managing Function Invocations 3. Optimizing Function Invocations 4. Managing Execution Units 5. Optimizing Execution Units 6. Managing CPU Time 7. Optimizing CPU Time Plan Usage Your Edge Functions usage is tied to the plan you choose, whether it's Hobby or Pro. Here's a breakdown: - Hobby: Includes 500,000 execution units per month. - Pro: Includes 1,000,000 execution units per month. Additional charges apply for usage beyond these limits. Managing Function Invocations Function invocations are tracked and billed based on the number of times your functions are called. This includes both successful and failed attempts but excludes cache hits. Key points: - Billing: Calculated per invocation, regardless of response status. - Grouping: You can group invocations by count to analyze team-wide usage. Optimizing Function Invocations To reduce the number of invocations: - Caching: Use Edge Caching and Cache-Control headers to store responses and avoid redundant calls. - Projects Overview: Track invocation counts per project to identify high-usage areas for optimization. Managing Execution Units Execution units are based on CPU time consumed by your functions. Each unit represents 50ms of CPU usage. Key points: - Calculation: Total CPU time used per invocation divided by 50ms. - Billing: Charges apply for usage exceeding the plan limits. Optimizing Execution Units To minimize execution units: - Caching: Reduce redundant invocations through caching. - Efficient Code: Optimize your code to use fewer resources per invocation. Managing CPU Time CPU time is the actual processing time your functions take. It doesn't include waiting for external data fetches. 
Key points: - No Time Limits: There's no cap on CPU time, but excessive usage can lead to higher costs. - Observability: Use metrics to monitor average CPU time across projects. Optimizing CPU Time To optimize CPU time: - Code Performance: Ensure your code is as efficient as possible. - Avoid Heavy Operations: Minimize CPU-intensive tasks that could delay responses. Conclusion By understanding and optimizing Edge Functions usage, you can enhance performance while managing costs effectively. Regularly review your metrics and adjust strategies to ensure optimal resource utilization.
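The execution-unit arithmetic described above (one unit per 50 ms of CPU time) can be sketched as a helper. The page states the division but not the rounding, so rounding partial units up is an assumption:

```javascript
// One Edge Function execution unit = 50 ms of CPU time.
// Assumption: partial units round up; the docs state only the division.
const UNIT_CPU_MS = 50;

function executionUnits(cpuTimeMs) {
  if (cpuTimeMs <= 0) return 0;
  return Math.ceil(cpuTimeMs / UNIT_CPU_MS);
}
```

For example, an invocation that uses 120 ms of CPU consumes 3 units, which is why trimming CPU-heavy work directly lowers the billed unit count.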


07. pricing: Manage and optimize usage for Edge Middleware

Manage and Optimize Usage for Edge Middleware Edge Middleware is a powerful tool that allows you to enhance your application's performance and functionality. However, understanding how it works and how to optimize its usage is crucial for managing costs effectively. Table of Contents - Introduction - Pricing Metrics - Invocations - CPU Time - Optimizing Middleware Invocations - Optimizing Middleware CPU Time - Getting Started Introduction Edge Middleware is priced based on the number of times your middleware is invoked and the CPU time it consumes. By understanding these metrics, you can better manage your usage and reduce costs. Pricing Metrics Invocations - Description: The number of times your middleware is invoked. - How It Works: Your middleware is called for every route in your project by default. - Optimization Tip: Use the config matcher property to limit routes where your middleware is invoked. CPU Time - Description: The time your middleware spends computing responses to requests. - How It Works: CPU time refers to actual net CPU usage, not execution time. Operations like network access do not count towards CPU time. - Optimization Tip: Avoid using fetch() in middleware as it slows down the TTFB. Optimizing Middleware Invocations 1. Group by Count: View total invocations across all projects to identify high-use areas. 2. Use Projects: Check invocations per project to pinpoint where optimization is needed. 3. Config Matcher: Limit middleware invocation by specifying routes that require it. Optimizing Middleware CPU Time 1. Average CPU Time: See the average computation time across all projects in your team. 2. Project CPU Time: Review total CPU time for each project to identify high consumers. 3. Avoid Fetch: Use efficient data fetching methods to reduce TTFB delays. Getting Started 1. Set up your Next.js project with Edge Middleware. 2. Use the Epycbyte dashboard to monitor invocations and CPU time. 3. Apply optimization strategies based on your findings. 
By understanding these metrics and applying optimizations, you can effectively manage and optimize usage for Edge Middleware, ensuring better performance and cost efficiency.
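The config matcher idea — running middleware only on the routes that need it — can be illustrated with a simple path filter. This is a toy stand-in for the real matcher (in Next.js it is declared as `export const config = { matcher: [...] }`), and the route prefixes are hypothetical:

```javascript
// Toy sketch: invoke middleware only for paths under configured
// prefixes, instead of on every route in the project.
function matchesRoute(path, matcherPrefixes) {
  return matcherPrefixes.some((prefix) => path.startsWith(prefix));
}

// Hypothetical config: only dashboard and checkout routes run middleware.
const matcher = ['/dashboard', '/checkout'];
```

With such a filter in place, a request to a marketing page never counts as a middleware invocation, which is exactly the savings the matcher property provides.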


07. pricing: Manage and Optimize Usage for Epycbyte KV

Manage and Optimize Usage for Epycbyte KV Understanding how your Epycbyte KV resources are utilized is crucial for optimizing costs and performance. This article provides insights into managing and optimizing your KV usage, including pricing metrics, resource management, and optimization strategies. Table of Contents 1. Understanding Pricing 2. Managing KV Requests 3. Optimizing KV Data Transfer 4. Managing KV Storage 5. Managing KV Databases 6. Observability and Cost Tracking Understanding Pricing Epycbyte KV pricing is structured to provide clear limits and charges for additional usage. Here’s a breakdown of the key metrics: - Requests: The number of Redis commands made to your KV stores. - Included: First 30,000 requests per month. - Additional Requests: $0.35 per 1,000 requests beyond the included limit. - Data Transfer: The amount of data transferred between your KV stores and compute endpoints. - Included: First 256 MB per month. - Additional Data Transfer: $0.10 per GB beyond the included limit. - Storage: The maximum amount of data stored across all KV stores. - Included: First 512 MB per month. - Additional Storage: $0.25 per GB beyond the included limit. - Databases: The number of KV databases (including read replicas). - Included: First database per plan. - Additional Databases: $1.00 per additional database, up to the plan maximum. Managing KV Requests The Requests chart tracks the total number of Redis commands made to your KV stores. Each plan includes a set number of requests, with additional charges for usage beyond this limit. Optimizing KV Requests - Replicate Databases: Distribute data across multiple regions to reduce latency and improve availability. - Read Replicas: Use read replicas to offload reads and distribute traffic, but be mindful that each write operation will incur charges for both the primary database and its replicas. - Limit Unused Queries: Reduce unnecessary queries to avoid incurring charges on unused resources. 
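The rates listed under Understanding Pricing can be turned into a small overage estimator. The per-unit prices and included amounts below are taken from this page; the rounding to two decimals is an assumption, not the billing system's exact behavior:

```javascript
// Estimate monthly Epycbyte KV overage charges from the rates above.
// Included: 30,000 requests, 256 MB transfer, 512 MB storage, 1 database.
function kvOverageUSD({ requests, transferGB, storageGB, databases }) {
  const requestCost = (Math.max(0, requests - 30_000) / 1_000) * 0.35;
  const transferCost = Math.max(0, transferGB - 0.25) * 0.10; // 256 MB = 0.25 GB
  const storageCost = Math.max(0, storageGB - 0.5) * 0.25;    // 512 MB = 0.5 GB
  const dbCost = Math.max(0, databases - 1) * 1.00;
  return Number((requestCost + transferCost + storageCost + dbCost).toFixed(2));
}
```

For example, 50,000 requests, 1 GB of transfer, 1 GB of storage, and two databases would estimate to $7.00 + $0.08 + $0.13 + $1.00 ≈ $8.20 of overage.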
Optimizing KV Data Transfer The Data Transfer chart shows the amount of data transferred between your KV stores and compute endpoints. Exceeding included limits incurs additional costs. Strategies - Minimize Redundant Transfers: Avoid transferring data unnecessarily. - Optimize Data Sizes: Compress or minimize data sizes before transfer to reduce costs. - Use Caching: Implement caching mechanisms to reduce repetitive data transfers and improve efficiency. Managing KV Storage The Storage chart displays the maximum amount of data stored across all KV stores. Costs are incurred for storage beyond the included limits. Optimization Tips - Delete Unused Data: Regularly clean up data that is no longer needed. - Optimize Data Structure: Use efficient data structures to minimize storage requirements. - Limit Read Replicas: Reducing read replicas can help lower storage costs, as they count toward your storage usage. Managing KV Databases The Databases chart shows the number of active KV databases and their read replicas. Costs are based on the number of databases and read replicas beyond plan limits. Optimization Strategies - Delete Unused Read Replicas: If read replicas are no longer needed, remove them to reduce storage and database costs. - Plan Database Growth: Ensure your database configuration aligns with expected growth to avoid over-provisioning. - Monitor Database Activity: Regularly review database usage to optimize performance without unnecessary costs. Observability and Cost Tracking To effectively manage your KV resources, monitor key metrics such as request volume, data transfer, storage usage, and database activity. Tools like Epycbyte’s dashboard provide insights into resource utilization and cost tracking. Continuous Monitoring - Regularly review your KV usage to identify trends or spikes that may indicate optimization opportunities. - Adjust your resource allocation based on changing workloads to maintain cost efficiency. 
Conclusion

By understanding and optimizing your Epycbyte KV resources, you can achieve better performance while minimizing costs. Regular monitoring, efficient data management, and strategic use of read replicas are key to optimizing your KV usage. For more information about related services, such as Epycbyte Postgres and Observability, visit the respective sections of the Epycbyte documentation.


07. pricing: pricing

Epycbyte's pricing structure is organized into two main sections: Managed Infrastructure and the Developer Experience Platform (DX Platform), each offering distinct services with varying cost models. Here's a structured overview:

Managed Infrastructure Pricing

1. Regional Pricing: costs vary by region, such as $0.15 per GB for the first TB of Fast Data Transfer in Sydney. Data transfer rates and costs may differ based on the selected region.
2. Fast Data Transfer: charged based on the volume of data transferred. The first 1,000 GB costs $0.15 per GB; rates for subsequent data transfer are not specified here.
3. Fast Origin Transfer: data exchange between Epycbyte's Edge Network and compute functions (e.g., Serverless). The first 100 GB costs $0.06 per GB, with static files excluded.
4. Edge Requests: the first 10 million requests cost $2.00. Additional requests incur higher costs, and Edge Request Additional CPU Duration is charged based on runtime.
5. ISR Reads and Writes: Incremental Static Regeneration reads and writes for data access and storage. ISR Reads: the first 10 million at $0.40 per million. ISR Writes: the first 2 million at $4.00 per million.
6. WAF Rate Limiting and OWASP CRS: the WAF allows controlling request limits, with 1,000,000 requests costing $0.50 per month. The OWASP Top Ten rules managed firewall costs $0.80 per 1,000,000 inspected requests; additional payload size charges apply.

Developer Experience Platform (DX Platform) Pricing

1. Team Seats: additional users beyond the initial team cost $20 per month per seat.
2. Remote Cache: caching build artifacts; the first 10 GB is free, with subsequent storage at $0.50 per incremental GB.
3. Concurrent Builds: a number of simultaneous builds is included; additional builds may cost $50 per month, though details are not specified here.
4. Advanced Deployment Protection and Preview Deployment Suffix: add-ons for Pro plans. Advanced Deployment Protection costs $150 per month and Preview Deployment Suffix $100 per month.

Cost Considerations

- Usage scenarios: costs can vary widely based on usage patterns. For example, frequent data transfers or builds could lead to higher expenses.
- Tiered pricing: features and resources may be tiered, with Pro plans offering add-ons and Enterprise plans requiring direct sales contact for pricing.
- Billing models: costs are typically usage-based, with different tiers applying to various services.


07. pricing: Manage and optimize usage for Serverless Functions

Manage and Optimize Usage for Serverless Functions

Understanding the metrics and optimizing your serverless functions can help you manage costs and improve performance. This article covers key aspects such as function invocations, duration, and throttles.

Serverless Functions Overview

In Next.js, serverless functions are created from specific files, such as those in the pages or app directories. Understanding how these functions interact with your application is crucial for optimizing performance and costs.

Function Invocations

- Pricing metrics: Epycbyte charges based on the number of function invocations and their duration. Each invocation counts towards your bill, regardless of success or failure.
- Optimizing function invocations:
  - Use caching strategies to reduce repeated calls.
  - Implement lazy loading for static assets to minimize client-side requests.

Function Duration

- Calculating GB-Hours: duration is billed in GB-Hours, which multiply the memory allocated to your function (in GB) by its total execution time (in hours). For example, a function with 1 GB of memory whose invocations run for a combined total of one hour consumes 1 GB-Hour.
- Optimizing function duration:
  - Adjust the maximum execution time to prevent long-running processes.
  - Allocate more resources only when your function genuinely requires them, since larger allocations increase GB-Hours.

Throttles

Throttles count the number of times a request could not be served due to concurrency limits. While not directly charged, excessive throttles can lead to 429 errors and impact user experience.

Managing Function Invocations

- Monitor your dashboard for insights into function usage.
- Use Epycbyte's grouping feature to track performance by function or route.

Optimizing Function Invocations

- Implement rate limiting on the client side to reduce unnecessary calls.
- Use tools like Varnish or Cloudflare Workers for caching and optimization.

Conclusion

By understanding and optimizing your serverless functions, you can minimize costs and improve performance. Regularly review your metrics and adjust configurations as needed to ensure efficient resource usage.
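The GB-Hours calculation can be sketched as a helper. This assumes the common definition (memory in GB multiplied by execution time in hours, summed across invocations); the exact billing formula is Epycbyte's, and the memory sizes in the example are illustrative:

```javascript
// GB-Hours ≈ memory allocated (GB) × total execution time (hours).
// Illustrative sketch — not the exact billing formula.
function gbHours(memoryMB, totalDurationMs) {
  const memoryGB = memoryMB / 1024;
  const hours = totalDurationMs / 3_600_000; // ms per hour
  return memoryGB * hours;
}
```

A function with 1,024 MB of memory running for a total of one hour of invocations consumes 1 GB-Hour; halving either the memory allocation or the total duration halves the figure.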


07. pricing: Spend Management

Spend Management is a feature designed to help you monitor and control your team's spending on Epycbyte. This tool allows you to set specific spend thresholds and take action when your budget is reached or exceeded. How Spend Management Works Spend Management enables you to: - Monitor Spending: Track your team's current spending in real-time. - Set Budgets: Define a monthly spending limit for your team. - Pause Projects: Temporarily stop spending on specific projects when the budget is near its limit. - Receive Notifications: Get alerts when your spending approaches or reaches your budget. Setting Your Spend Amount 1. Log into your Epycbyte account and navigate to the Spend Management section. 2. Enter the monthly budget you want to set for your team. 3. Save your preferences. Once your budget is set, Epycbyte will monitor your spending 24/7. When your spending approaches or reaches your budget, you'll receive an alert. Managing Alert Thresholds You can customize alerts to notify you when: - Spending reaches 90% of the budget. - Spending reaches 100% of the budget (budget is fully utilized). - The billing cycle ends. Pausing Projects When your spending is close to or at your budget limit, you can pause projects to avoid exceeding your spend limit. This ensures that you stay within your allocated resources. Resuming Projects After adjusting your spending, you can resume projects manually. Epycbyte does not automatically resume paused projects; this must be done manually by the account owner. Configuring a Webhook You can set up a webhook to receive notifications when: - Spending reaches 100% of the budget. - The billing cycle ends (useful for resuming paused projects). To configure a webhook: 1. Go to the Spend Management section. 2. Enter your desired webhook URL in the appropriate field. 3. Save your settings. Epycbyte will send a POST request to your webhook URL with details about the event, such as the current spend amount and team ID. 
Webhook Payload The payload includes: - budgetAmount: The spend limit you set. - currentSpend: The actual spending up to that point. - teamId: Your Epycbyte Team ID. Example: { "budgetAmount": 500, "currentSpend": 500, "teamId": "team_jkT8yZ3oE1u6xLo8h6dxfNc3" } End of Billing Cycle A webhook is also sent when the billing cycle ends. This can be used to resume paused projects. Activity Tracking Epycbyte tracks all spend management activities, such as budget updates and project pauses, in the Activity section of your dashboard. Resources For more information on pricing, consumption optimization, and invoices: - Conceptual: How are resources used on Epycbyte? - How-to: Manage and optimize usage - Understanding my invoice
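The payload above can be handled with a small amount of logic on the receiving end. The sketch below is illustrative only: the field names (`budgetAmount`, `currentSpend`, `teamId`) come from the example payload, but the `budgetExceeded` helper and its use are assumptions about how you might react to the event, not documented Epycbyte behavior.

```typescript
// Sketch: interpreting a Spend Management webhook payload.
// Field names come from the example payload in this article.
interface SpendPayload {
  budgetAmount: number; // the spend limit you set
  currentSpend: number; // actual spending up to that point
  teamId: string;       // your Epycbyte Team ID
}

// Returns true when the budget is fully utilized, i.e. when you
// might pause projects or trigger other remediation.
function budgetExceeded(payload: SpendPayload): boolean {
  return payload.currentSpend >= payload.budgetAmount;
}

const payload: SpendPayload = {
  budgetAmount: 500,
  currentSpend: 500,
  teamId: "team_jkT8yZ3oE1u6xLo8h6dxfNc3",
};
console.log(budgetExceeded(payload)); // true: spend has reached the budget
```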

Last updated on Aug 05, 2025

08. projects / domains: Uploading Custom SSL Certificates

Uploading Custom SSL Certificates By default, Epycbyte provisions an SSL certificate automatically for every domain. However, Enterprise teams can upload their own custom SSL certificate to serve it through Epycbyte's edge network. Table of Contents 1. Uploading Custom SSL Certificates 2. SSL Best Practices Uploading Custom SSL Certificates Custom SSL certificates can be uploaded through: - Account-scoped domains configuration page - Domains tab on your team's dashboard - Epycbyte REST API Steps to Upload a Custom SSL Certificate 1. Provide the Private Key: Extract the private key from your PEM file and paste it directly into the input box. 2. Provide the Certificate: Extract the certificate from your PEM file and paste it directly into the input box. 3. Provide the Certificate Authority (CA) Root Certificate: Obtain this from your certificate issuer, such as Let's Encrypt, and paste it into the input box. SSL Best Practices - The custom certificate will be prioritized over the automatically generated one. - If you remove the custom certificate, Epycbyte will revert to the auto-generated certificate. - You can include canonical names (CNs) for other subdomains without adding them to Epycbyte. - Wildcard certificates are supported and can be uploaded. This article was last updated on July 24, 2024. For more information, see the Supported Domains for Purchase section.
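The upload steps above require splitting a combined PEM bundle into its individual blocks (private key, certificate, CA chain). The helper below is an illustrative string-handling sketch, not a full PEM parser, and is not part of any Epycbyte API:

```typescript
// Sketch: split a combined PEM bundle into its blocks so each part
// can be pasted into the matching input box.
function splitPemBlocks(bundle: string): string[] {
  const blocks = bundle.match(
    /-----BEGIN [A-Z ]+-----[\s\S]*?-----END [A-Z ]+-----/g
  );
  return blocks ?? []; // empty array when no PEM blocks are found
}

const bundle = [
  "-----BEGIN PRIVATE KEY-----\nMIIB...\n-----END PRIVATE KEY-----",
  "-----BEGIN CERTIFICATE-----\nMIIC...\n-----END CERTIFICATE-----",
].join("\n");

console.log(splitPemBlocks(bundle).length); // 2 blocks: key and certificate
```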

Last updated on Aug 05, 2025

08. projects / domains: working with dns

DNS OVERVIEW DNS is the system used to connect domain names to IP addresses. When you make a request for a website, the browser performs a DNS query. It's usually the recursive resolver that carries out this work, going to the root DNS nameserver, the TLD nameserver, and the authoritative server if the answer isn't found in the cache. DNS RECORDS There are a number of different types of DNS records that can be used together to create a DNS configuration. Some of the common information that you might see in a DNS record includes: - Host Name: The hostname, such as www - IP Address or URL: The IP address (or domain, in the case of a CNAME record) - TTL (Time to live): The length of time the recursive server should keep a particular record in its cache - Record Type: For example, CNAME COMMON DNS RECORD TYPES: - A: Translates domain names into IPv4 addresses. - AAAA: Similar to A but for IPv6 addresses. Not supported on Epycbyte. - ALIAS: Maps a domain name to another domain name. Can only be used at the zone apex. - CAA: Specifies certificate authorities for domains. Epycbyte adds CAA records for Let's Encrypt. - CNAME: Aliases a domain name to another domain name. Cannot be used at the zone apex. - HTTPS: Similar to CNAME but can be used at the zone apex. Includes additional info like ALPN protocols. DNS PROPAGATION DNS propagation is the process of updating DNS records across servers and caches. Propagation time varies depending on the DNS provider. Epycbyte typically takes 60 seconds for DNS records to propagate fully. DNS BEST PRACTICES - Check existing DNS records before switching providers. - Shorten TTL to 60 seconds about 24 hours before making changes. - Use tools like WhatsMyDNS.net to monitor propagation. - Update DNS records once propagation is confirmed. 
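The apex rules described above (CNAME cannot sit at the zone apex, ALIAS can only sit at the apex) can be encoded as a small validation check. This sketch is illustrative; the `"@"` convention for the apex is an assumption borrowed from common DNS tooling:

```typescript
// Sketch: validate where a record type may be placed, per the
// apex rules listed for ALIAS and CNAME records.
type RecordType = "A" | "AAAA" | "ALIAS" | "CAA" | "CNAME" | "MX";

function isValidAtName(type: RecordType, name: string): boolean {
  const atApex = name === "@" || name === ""; // "@" conventionally means the apex
  if (type === "CNAME") return !atApex; // CNAME: anywhere except the apex
  if (type === "ALIAS") return atApex;  // ALIAS: only at the apex
  return true; // other record types are allowed anywhere
}

console.log(isValidAtName("CNAME", "www")); // true
console.log(isValidAtName("CNAME", "@"));   // false: not allowed at the apex
```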
TROUBLESHOOTING Common issues include: - Misconfigured DNS records - Incorrect TTL settings - Slow propagation due to large TTL - CNAME loops RELATED TOPICS - Assigning domains to branches - Managing nameservers - Working with SSL certificates - Troubleshooting domain misconfigurations LAST UPDATED: October 29 2024


08. projects / domains: working with domains

Working with Domains on Epycbyte: A Comprehensive Guide Overview of Domains A domain name is a unique identifier for a website or email address, allowing users to access specific services online. Domains are managed through domain registrars, and their configuration depends on the provider's tools. Buying a Domain Name Buying Through Epycbyte - Process: Epycbyte offers domain registration directly through its platform. - Features: Includes DNS settings and email configurations. Buying Through Third-Party Providers - Options: Many third-party registrars offer domains at different prices and with varying features. - Configuration: Requires manual setup in Epycbyte's dashboard. Domain Ownership and Project Assignment Who Owns the Domain? - Account or Team: Decide whether the domain is owned by an individual user or a team for better management. Subdomains, Wildcard Domains, and Apex Domains Subdomains - Use Case: For specific parts of a website (e.g., blog.example.com). - Configuration: Set up via CNAME records in Epycbyte's dashboard. Wildcard Domains - Use Case: For scalable projects (e.g., *.example.com). - Configuration: Requires nameserver setup and DNS records. Apex Domain - Example: acme.com. - Recommendation: Configure a redirect to www.acme.com for better control. Using Email with Domains MX Records - Purpose: Point email services (e.g., ImprovMX, Forward Email) via DNS settings. Troubleshooting Domain Issues Common Issues - Misconfigurations: Check DNS records and nameservers. - Invalid Entries: Ensure domain names are correctly entered.


08. projects: Domains Overview

Domains Overview Learn the fundamentals of how domains, DNS, and nameservers work on Epycbyte. A domain is a user-friendly way of referring to the address used to access a website on the internet. For example, the domain you're reading this on is epycbyte.com. A domain is analogous to your house's address: when someone sends a letter to your house, they don't need to know exactly where it is; they just need the address, and the relevant post office handles routing the letter. The system that manages the details about where a site is located on the internet is known as DNS, or the Domain Name System. At its most basic, DNS maps human-readable domain names to computer-friendly IP addresses. When you request a site in your browser, the first step is converting the domain address to an IP address. That process is handled by DNS and is called DNS resolution. Understanding how DNS works is important to ensure that you are configuring your domain correctly. Diagram showing a basic DNS query. You enter epycbyte.com in your browser. Your browser first checks its local DNS cache to see if it knows the IP address of epycbyte.com. If it does, it requests the site from that address. Otherwise, your browser initiates a DNS query through a server known as a recursive resolver, usually provided by your ISP or a third party. The recursive resolver acts as a middleman between the browser and DNS servers and is used to increase the speed of the network. Recursive resolvers cache results to reduce latency and improve performance. 
The recursive resolver will contact the authoritative nameserver for epycbyte.com to get the correct IP address. Once the IP address is retrieved, the request can be fulfilled, and the website can be displayed in the browser. Nameservers are an important part of the DNS. They refer to the actual server that maintains and manages the DNS records. There are three types of nameservers: root nameserver, TLD nameserver, and the authoritative server. Root nameservers are responsible for providing the top-level domain (TLD) information, such as .com or .net. TLD nameservers handle specific domains, like google.com. The authoritative nameserver is the final authority for a domain's DNS records. It contains the most up-to-date and accurate information about the domain. CNAME records are used to map a domain name to its canonical name. For example, www.epycbyte.com might have a CNAME record pointing to epycbyte.com. A records map a domain to its IPv4 address. They are useful for locating servers that use IPv4 addresses. NS records specify the nameserver responsible for answering DNS queries for a particular domain. MX records indicate where mail should be sent for a domain. These are all described in more detail in Working with DNS. You can learn more about using a nameserver with Epycbyte in Working with nameservers. SSL Certificates are a way to show that there is a secure connection from your domain to your website. These are described in more detail in Working with SSL certificates. Related Working with Domains Learn how domains work and the options Epycbyte provides for managing them. 
Working with DNS Learn how DNS works in order to properly configure your domain. Working with Nameservers Learn about nameservers and the benefits Epycbyte nameservers provide. Working with SSL Learn how Epycbyte uses SSL certificates to keep your site secure. Troubleshooting Domains Learn about common reasons for domain misconfigurations and how to troubleshoot your domain on Epycbyte. Last updated on October 29, 2024


08. projects: environment variables

Environment Variables Environment variables are key-value pairs configured outside your source code to allow values to change based on the environment. These values are encrypted at rest and accessible to users with project access, making them safe for both sensitive and non-sensitive data, such as tokens. Table of Contents 1. Conceptual Overview 2. Creating Environment Variables 3. Size Limits 4. Environments 5. Using Environment Variables 6. Integration with Other Tools Conceptual Overview Environment variables are essential for flexible application behavior, allowing configurations to change without altering source code. They are particularly useful during builds and function executions. Creating Environment Variables Variables can be defined at the team or project level: - Team-level: Accessible across all projects in the team. - Project-level: Accessible only within a specific project. For more details, refer to Managing Environment Variables. Size Limits Epycbyte supports environment variables up to 64KB per deployment: - Total size includes all variables configured via dashboard or CLI. - No single variable can exceed 64KB. - Edge Functions and Middleware are limited to 5KB per variable. Supported runtimes include Node.js, Python, Ruby, Go, Java (11+), .NET, PHP, and custom runtimes via Build Output API. Environments Variables can be applied to specific environments: 1. Production: Applied to next deployment on the Production branch. 2. Preview: Applied to deployments from non-Production branches. 3. Development: Used locally with epycbyte dev. To apply variables selectively, use the CLI or manage branch-specific overrides. Using Environment Variables - Local Development: Defined in .env.local files or pulled using epycbyte env pull. - Integration: Automatically added via third-party tools like MongoDB. - Edge Functions: Limited to 5KB per variable. 
Integration with Other Tools Environment variables can be integrated with various tools and services, ensuring consistent configurations across different platforms. Conclusion Epycbyte's environment variables provide a secure and efficient way to manage application configurations. By leveraging team or project-level settings, you can optimize performance and flexibility across multiple environments.
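The size limits above (64KB total per deployment, 5KB per variable for Edge Functions and Middleware) can be sanity-checked before deploying. The limits come from this article; the checking function itself is an illustrative sketch, not an Epycbyte API:

```typescript
// Sketch: check a set of environment variables against the
// documented size limits before deploying.
const TOTAL_LIMIT = 64 * 1024;   // 64KB total per deployment
const EDGE_VAR_LIMIT = 5 * 1024; // 5KB per variable for Edge Functions

function withinLimits(vars: Record<string, string>, edge = false): boolean {
  let total = 0;
  for (const [key, value] of Object.entries(vars)) {
    const size = Buffer.byteLength(key + "=" + value, "utf8");
    if (edge && size > EDGE_VAR_LIMIT) return false; // single-variable limit
    total += size;
  }
  return total <= TOTAL_LIMIT; // total-size limit
}

console.log(withinLimits({ API_URL: "https://api.example.com" })); // true
```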


08. projects: Projects overview

Projects overview A project on Epycbyte represents an application deployed to the platform from a single Git repository. Each project can have multiple deployments: a single production deployment and many pre-production deployments. A project groups deployments and custom domains. While each project is connected to a single imported Git repository, you can have multiple projects connected to a single Git repository with multiple directories, which is useful for monorepo setups. Project Dashboard - View an overview of the production deployment and any active pre-production deployments. - Configure project settings such as: - Custom domains - Environment variables - Deployment protection - More - View details about each deployment, including: - Status - Commit that triggered the deployment - Deployment URL - More - Manage observability for the project, including: - Web Analytics - Speed Insights - Logs - Manage the project's firewall. Project limits For more information on project limits, see Limits. Error codes For detailed error code documentation, refer to the Error Codes section.


08. projects / project-configuration: Git Settings

Git Settings Once you have connected a Git repository to your project, select the Git menu item from your project settings page to edit your project’s Git settings. These settings include: Connected Git Repository - Managing production branch name: Ensure the correct branch is specified for production deployments. - Git Large File Storage (LFS): Enable or disable LFS support to manage large files in your repository. Git Large File Storage (LFS) If your repository contains LFS objects, enable LFS from project settings. Epycbyte will pull these objects when enabled. Be sure to redeploy after enabling LFS. Deploy Hooks Epycbyte supports deploy hooks, which are unique URLs that trigger deployments via HTTP POST requests. Check the deploy hooks documentation for more details. Ignored Build Step By default, Epycbyte creates a new deployment and build for each commit pushed to your connected Git repository. To skip the build step: 1. Select a project from the dashboard. 2. Go to Settings > Git. 3. In the Ignored Build Step section, choose the desired behavior: - Automatic: Always create a new build. - Only build production/preview: Build only when EPYCBYTE_ENV is set accordingly. - Only build if changes: Build only when there are changes in the Git diff. - Custom commands: Run specific scripts or commands to control builds. Disconnecting Git Repository To disconnect a Git repository: 1. Select a project from the dashboard. 2. Go to Settings > Git. 3. Under Connected Git Repository, click Disconnect. This guide provides comprehensive details on managing Git settings for your Epycbyte project. Use it to optimize deployments and manage build steps effectively.
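A custom Ignored Build Step command usually boils down to a decision: did anything relevant change? The sketch below is illustrative only; the exit-code convention (0 to skip the build, nonzero to proceed) and the changed-path check are assumptions for illustration, not documented Epycbyte behavior:

```typescript
// Sketch: decide whether a build is needed based on which paths
// changed, e.g. for a monorepo where only one app should rebuild.
function shouldBuild(changedPaths: string[], watched: string): boolean {
  return changedPaths.some((p) => p.startsWith(watched));
}

const changed = ["docs/README.md", "apps/site/page.tsx"];
console.log(shouldBuild(changed, "apps/site/")); // true: app files changed
// A real script would end with something like:
//   process.exit(shouldBuild(changed, "apps/site/") ? 1 : 0);
```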


08. projects: project configuration

To effectively configure redirects, rewrites, and headers in your Epycbyte setup, follow these organized steps: 1. Redirection Setup: - Use redirects for temporary URL changes or when you want the client to recognize the change (307 status). - Example: Redirect "/posts/:id" to "/blog/:id" temporarily. 2. Rewrite Configuration: - Utilize rewrites for dynamic content where the URL structure needs modification without altering the resource. - Example: Serve all unmatched paths with "/index.html" for SPAs. 3. Header Application: - Add headers to enhance caching, security, or other HTTP attributes. - Example: Set Cache-Control for favicon.ico and static assets. 4. Regex Handling: - In routes, escape dots to avoid unintended character matching. - Use rewrites without escaping dots for precise regex patterns. 5. Negative Lookaheads: - Implement negative lookaheads in routes to prevent infinite loops, redirecting only specific paths like "/maintenance". 6. Case Sensitivity: - Use rewrites or redirects instead of routes to avoid case-insensitive matching issues, ensuring unique URLs and SEO-friendly practices. 7. Order of Processing: - Epycbyte processes redirects before rewrites. - Headers are applied by default unless specified otherwise. 8. Testing and Validation: - Test each configuration individually, starting with simple redirects and gradually adding complexity. - Ensure headers don't interfere with existing configurations.
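The redirect, rewrite, and header steps above could be expressed together in a project-level configuration file. The snippet below is a hedged sketch: it assumes `epycbyte.json` accepts `redirects`, `rewrites`, and `headers` arrays with these field names, which should be confirmed against the project-configuration reference before use.

```json
{
  "redirects": [
    { "source": "/posts/:id", "destination": "/blog/:id", "permanent": false }
  ],
  "rewrites": [
    { "source": "/((?!api/).*)", "destination": "/index.html" }
  ],
  "headers": [
    {
      "source": "/favicon.ico",
      "headers": [{ "key": "Cache-Control", "value": "public, max-age=86400" }]
    }
  ]
}
```

The rewrite uses a negative lookahead, as described in step 5, so that `/api/...` paths are not swallowed by the SPA fallback.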


09. deployments: build features

Build Features for Customizing Deployments Epycbyte provides several features to customize your deployments, ensuring security and performance while allowing flexibility in how your project is managed and accessed. Private npm packages When working with private npm modules that require authentication, follow these steps: 1. Define NPM_TOKEN as an environment variable in your project. 2. Alternatively, define NPM_RC in the .npmrc file located at the root of your project folder. This allows Epycbyte to install and use your private dependencies securely. Ignored files and folders Epycbyte ignores specific files and folders by default to enhance security and performance: - Common ignored files: - .hg, .git, .gitmodules, .svn, .cache, .next, .now, .epycbyte, .npmignore, .dockerignore, .gitignore, *.swp, .DS_Store, .wafpick-*, .lock-wscript, .env.local, .env.*.local, .venv - Directories: - node_modules, __pycache__, venv, CVS These files are automatically excluded from deployment without needing to modify .epycbyteignore. Special paths Access your deployment's source code and build logs using special pathnames: - /_src: Redirects to the Deployment inspector for viewing sources. - /_logs: Provides real-time log streaming. These paths are protected by default but can be made public under Security settings in Project Settings. Git submodules Deploy Git submodules publicly via HTTP. Private or SSH-accessed submodules will fail during build unless referenced as npm packages in your package.json. For private repositories, use the provider-specific syntax in your dependencies.
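For step 1 above, a common `.npmrc` pattern reads the token from the `NPM_TOKEN` environment variable at install time. This uses npm's standard per-registry `_authToken` syntax; the registry host shown is the public npm registry, so swap it for your private registry's host if you use one.

```ini
; .npmrc at the project root: authenticate against the registry
; using the NPM_TOKEN environment variable defined in the project.
//registry.npmjs.org/:_authToken=${NPM_TOKEN}
```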


09. deployments / build-image: build image

Build Image Overview The build image uses Amazon Linux 2023 as the base image, and Epycbyte automatically uses it for all deployments whose project settings specify the 22.x or 20.x Node.js version. Pre-installed packages - alsa-lib - at-spi2-atk - atk - autoconf - automake - bsdtar - bzip2 - bzip2-devel - cups-libs - expat-devel - gcc - gcc-c++ - git - glib2-devel - glibc-devel - gtk3 - gzip - ImageMagick-devel - iproute - java-11-amazon-corretto-headless - libXScrnSaver - libXcomposite - libXcursor - libXi - libXrandr - libXtst - libffi-devel - libglvnd-glx - libicu - libjpeg - libjpeg-devel - libpng - libpng-devel - libstdc++ - libtool - libwebp-tools - make - nasm - ncurses-libs - ncurses-compat-libs - openssl - openssl-devel - openssl-libs - pango - procps - readline-devel - ruby-devel - strace - tar - unzip - which - zlib-devel - zstd Running the build image locally docker run --rm -it amazonlinux:2023.2.20231011.0 sh When you are done, run exit to return. Installing additional packages You can install additional packages into the build container by configuring the Install Command within the dashboard or the "installCommand" in your epycbyte.json to use any of the following commands: - List all packages: dnf list - Search for a package: dnf search my-package-here - Install a package: dnf install -y my-package-here This document was last updated on July 23, 2024. For previous versions, refer to the Build Image Overview (legacy).


09. deployments: Build Image Overview

Build Image Overview When you initiate a deployment on Epycbyte, your project is built within a container that uses a predefined image. This build image is determined by the Node.js version specified in your project settings. Table of Contents 1. Understanding Build Images 2. Determining the Build Image 3. Runtime Support 4. Build Image (Legacy) 5. Troubleshooting Builds Understanding Build Images A build image is a container image that provides the environment and dependencies needed to build your application. Epycbyte supports multiple runtimes, each corresponding to different versions of Node.js. Determining the Build Image The build image used by Epycbyte is based on the Node.js version selected in your project settings: - Node.js 22.x and 20.x: Use the current build image (Amazon Linux 2023). - Node.js 18.x and 16.x: Use the legacy build image (Amazon Linux 2). Runtime Support Runtime versions vary by the selected build image: | Runtime | Build Image | Build Image (Legacy) | |---------|-------------|---------------------| | Node.js | 22.x, 20.x | 18.x, 16.x | | Python | 3.12 | 3.9, 3.6 | | Ruby | 3.3.x | 3.2.x | | Go | Community runtimes | Community runtimes | Build Image (Legacy) The legacy build image (Amazon Linux 2) supports: - Node.js versions: 16.x and 18.x. - Python versions: 3.6 and 3.9. Troubleshooting Builds If you encounter issues with builds, refer to the Build Features section for troubleshooting tips. This article provides a comprehensive overview of Epycbyte's build images and their associated runtimes. Last updated on September 27, 2024.


09. deployments / builds: Package Managers

Package Managers Epycbyte supports various package managers for dependency management, ensuring optimal build performance. This guide outlines the supported package managers, how they are detected, and how you can manually specify a package manager for your project or deployments. Supported Package Managers The following table lists the package managers supported by Epycbyte, along with their install commands and versions: | Package Manager | Lock File | Install Command | Supported Versions | |-----------------|-----------|-----------------|-------------------| | Yarn | yarn.lock | yarn install | 1 | | npm | package-lock.json | npm install | 8, 9, 10 | | pnpm | pnpm-lock.yaml | pnpm install | 6, 7, 8, 9 | | Bun | bun.lockb | bun install | 1 | Epycbyte automatically detects the package manager based on the presence of a lock file in your project. If no lock file exists, it defaults to npm. Manually Specifying a Package Manager You can manually specify a package manager for your project or deployments. Here's how: Project Override To specify a package manager for all deployments in your project: 1. Navigate to your project on the Epycbyte dashboard. 2. Select the Settings tab. 3. Enable the Override toggle in the Build & Development Settings section. 4. Add your install command (e.g., pnpm install). 5. Save changes, and the specified package manager will be used on your next deployment. Deployment Override To specify a package manager for a single deployment: 1. Modify your epycbyte.json file in your project root. 2. Add an installCommand property with the desired command (e.g., "pnpm install"). 3. Epycbyte will use the oldest available version of the specified package manager during the build. Version Compatibility The specific version of a package manager used depends on the lock file's lockfileVersion: - For npm and pnpm, this is specified in their respective lock files. - For example, pnpm-lock.yaml with lockfileVersion: 9.0 uses pnpm 9. 
If no version is specified, Epycbyte defaults to using the latest compatible version based on Node.js versions. Conclusion Epycbyte simplifies dependency management by automatically detecting and using the appropriate package manager. For more details or assistance, feel free to ask!
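The lock-file detection described in the table above can be sketched as a simple lookup: the lock file present in the project root determines the package manager, with npm as the fallback. The precedence order when several lock files exist is an assumption for illustration; the mapping itself comes from the table.

```typescript
// Sketch: map lock files to package managers, per the table above.
// Precedence when multiple lock files exist is an assumption.
const LOCK_FILES: Array<[string, string]> = [
  ["pnpm-lock.yaml", "pnpm"],
  ["yarn.lock", "yarn"],
  ["package-lock.json", "npm"],
  ["bun.lockb", "bun"],
];

function detectPackageManager(files: string[]): string {
  for (const [lockFile, manager] of LOCK_FILES) {
    if (files.includes(lockFile)) return manager;
  }
  return "npm"; // documented default when no lock file exists
}

console.log(detectPackageManager(["package.json", "pnpm-lock.yaml"])); // pnpm
```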


09. deployments: Builds

Builds Table of Contents When you create a new project or push a new commit to a project on Epycbyte, you initiate a deployment. A deployment is made up of a few steps: the build step, running checks, and assigning a domain. Build process You can initiate an Epycbyte deployment in two ways: with the Epycbyte CLI, or by pushing changes to a connected Git repository on GitHub, GitLab, or Bitbucket. It's also possible for deployments to be initiated through an integration using the Epycbyte REST API. Depending on how you initiate the build, Epycbyte may put it in a queue to ensure builds happen in the right order and only the most recent deployment is built. Build step The build container receives a request indicating that a job is available. The build container is a Docker container that uses an Amazon Linux based image and includes some pre-installed packages. Epycbyte first authenticates and inspects the request, confirming its authenticity and your permission to create the deployment, to protect against unauthorized access. At this point, Epycbyte also validates the Epycbyte configuration in the epycbyte.json file. Build container The build container runs in a few regions on our Edge Network; you can determine which one by viewing your build logs. If you use Git to initiate your deployment, Epycbyte performs a shallow clone of your Git repository to fetch the most recent Git commit history. CLI deployments skip this step and follow the flow in the next step. Build process A POST request is made containing the project's files to be uploaded (without the ignored files) to a scalable, secure, and highly durable data storage service. Once the source files have been uploaded, another POST request is made to start the build and deployment process. Epycbyte will check for an existing build cache key; if it finds one, it will restore the previous build cache. 
Build output The build container creates a build output that runs on one of Epycbyte's supported runtimes and provisions resources such as: Serverless Functions for handling API routes and server-side rendered pages; Edge Functions for Middleware and other functions using the edge runtime; Optimized Images; and static output. Build limits The maximum duration of the build is 45 minutes. When the limit is reached, the build will be interrupted and the deployment will fail. Pricing

| Metric | Description | Priced | Optimize |
|--------|-------------|--------|----------|
| Build Time | The amount of time your deployments have spent being queued or building | Additional concurrent builds | Learn More |
| Number of Builds | How many times a build was issued for one of your deployments | No | N/A |

Understanding the Build Step The build step is a critical part of the deployment process. It involves validating and building your source code and outputting all assets into storage. Epycbyte supports various frameworks and will automatically configure the build settings for the most common configurations. If you have specific requirements, you can adjust build, output, and environment variables when creating or modifying your project. 
Build Container Details The build container is a Docker container based on Amazon Linux and includes pre-installed packages. It runs in multiple regions of Epycbyte's Edge Network, which you can view through your build logs. The container authenticates requests to ensure they are legitimate and verifies your permission to create the deployment, protecting against unauthorized access. Epycbyte creates a specific resource allocation for each build that cannot be increased. Each build cache can store up to 1 GB of data and is retained for one month. Currently, you cannot customize which files are cached. Build Limits The maximum duration of a build is 45 minutes. If this limit is reached, the build will be interrupted, and the deployment will fail. For more details, see "Cancelled Builds due to limits." This article provides a comprehensive overview of Epycbyte's build process, including how builds are initiated, container details, caching, and build limits.


09. deployments: Working with Checks

Epycbyte automatically keeps an eye on various aspects of your web application using the Checks API. Learn how to use Checks in your Epycbyte workflow here. Table of Contents 1. Types of Flows Enabled by Checks API 2. Checks Lifecycle 3. Building Your Checks Integration Types of Flows Enabled by Checks API - Core Checks: - 200 responses on specific pages or APIs. - Determine the deployment's health and identify issues with code, errors, or broken connections. - Performance: - Collects core web vital information for specific pages and compares it with the new deployment. - Helps you decide whether to build the deployment or block it for further investigation. - End-to-end: - Validates that your deployment has all required components to build successfully. - Identifies any broken pages, missing images, or other assets. - Optimization: - Optimizes information about the bundle size. - Ensures your website manages large assets like package and image sizes. Checks Lifecycle The diagram shows how a check works: 1. When a deployment is created, Epycbyte triggers the deployment.created webhook. This tells integrators that checks can now be registered. 2. An integrator uses the Checks API to create checks defined in the integration configuration. 3. When the deployment is built, Epycbyte triggers the deployment.ready webhook. This notifies integrators to begin checks on the deployment. 4. Epycbyte waits until all created checks receive an update. 5. Once all checks receive a conclusion, aliases will apply, and the deployment will go live. Building Your Checks Integration To build your Checks integration: 1. Provide low or no configuration solutions for developers to run checks. 2. Include a guided onboarding process from installation to the end result. 3. Provide relevant information about the outcome of the test on the Epycbyte dashboard. 4. Document how to go beyond default behavior to build custom tests for advanced users.
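The lifecycle above can be sketched as an event-to-action mapping on the integrator's side. The webhook names (`deployment.created`, `deployment.ready`) come from the lifecycle steps; the dispatch code and action names are illustrative, not part of the Checks API:

```typescript
// Sketch: an integrator's dispatch for the Checks lifecycle webhooks.
function nextAction(event: string): string {
  switch (event) {
    case "deployment.created":
      return "register-checks"; // step 2: create checks via the Checks API
    case "deployment.ready":
      return "run-checks"; // step 3: begin running registered checks
    default:
      return "ignore"; // other webhook events are not part of this flow
  }
}

console.log(nextAction("deployment.created")); // register-checks
```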

Last updated on Aug 05, 2025

09. deployments: configure a build

Configuring Builds on Epycbyte: A Guide to Settings and Options Epycbyte offers flexible build configurations to suit various project needs. Below is a breakdown of key settings that can help optimize your workflow. Overview Epycbyte provides comprehensive control over build processes, allowing you to customize aspects like frameworks, commands, and directories. This guide covers essential settings to ensure smooth deployments. Framework Preset - Selection: Choose from popular frameworks like Next.js or React. - Customization: If your project doesn't fit a specific framework, select "Other" for generic configurations. Build Command - Optional Field: Leave empty if no build is needed. - Usage: For static sites with HTML/CSS/JS, skip building by leaving the command empty. Output Directory - Purpose: Specify the directory that contains your build output; its contents are served as the root of your app. - Note: Ensure files are accessible within this directory; use relative paths like src instead of ... Install Command - For APIs: Customizable for Serverless Functions, especially useful for language-specific setups. - Language Support: Automatically adjusts based on the serverless function's language. Corepack - Experimental Feature: Enable by setting ENABLE_EXPERIMENTAL_COREPACK in your Project settings. - Package Managers: Choose a specific version like pnpm 7.5.1 for enhanced control. Development Command - Customization: Use epycbyte dev for local development with Serverless Functions. - Framework Support: Defaults to Next.js; customize if needed, ensuring the command includes $PORT. Skip Build Step - Use Case: For static sites without source changes. - Configuration: Select "Other" as the framework and leave the Build Command empty. Root Directory - Adjustment: Specify if your project's root differs from the repository structure. - CLI Usage: Use epycbyte <directory> instead of navigating through directories.
Skipping Unaffected Projects - Monorepo Support: Skip deployments for unchanged projects by enabling the switch in Project Settings. Last updated on August 15, 2024.
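The settings above are configured through the dashboard. As a sketch only, a project-level config covering the same options might look like the following — the file name epycbyte.json is mentioned elsewhere in these docs, but the exact key names below are assumptions, not confirmed by this guide:

```json
{
  "framework": null,
  "buildCommand": "npm run build",
  "outputDirectory": "dist",
  "installCommand": "npm install",
  "devCommand": "npm run dev -- --port $PORT"
}
```

Here `"framework": null` stands in for the "Other" preset, and the development command includes `$PORT` as the Development Command section requires.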

Last updated on Aug 05, 2025

09. deployments: Exclude Files from Deployments with .epycbyteignore

Excluding Files from Deployments with .epycbyteignore The .epycbyteignore file is a powerful tool for managing deployments on Epycbyte. It allows you to specify which files and directories should be excluded during the deployment process. Overview The .epycbyteignore file works similarly to a .gitignore file but is specific to Epycbyte. By placing this file in your project's root directory, you can exclude certain files and directories from being uploaded or served on Epycbyte. How It Works 1. Default Behavior: All files are included by default; only the paths listed in .epycbyteignore are excluded. 2. Re-including Files: To re-include a file or directory that an earlier rule excluded, add its name prefixed with ! in the .epycbyteignore file. 3. Structure: The file should be placed at the root level of your project. Example Usage To exclude specific files and directories: .epycbyteignore image private.html This will prevent the /image directory and /private.html file from being uploaded. Advanced Use Cases - Allowing Specific Files: Add !file_name to re-include a specific file. - Ignoring All Files: Start with /* to ignore all files in the root directory. .epycbyteignore /* - Combining Rules: Use multiple lines to create complex rules. Monorepos and Workflows - In a monorepo, the .epycbyteignore file at the project root takes precedence over any other configuration. - If no .epycbyteignore is found at the project root, Epycbyte uses the one at the repository root. Best Practices 1. Security: Use it to protect sensitive files like private.html. 2. Efficiency: Upload only necessary files, reducing deployment size and improving speed. 3. Compliance: Ensure compliance with specific project requirements by excluding unnecessary files. By leveraging the .epycbyteignore file, you can streamline your deployment process while keeping your project secure and efficient.
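Putting the rules above together, a .epycbyteignore file might look like this (the directory name public is illustrative):

```
# Exclude the /image directory and /private.html from the upload
image
private.html

# Alternatively, ignore everything in the root and re-include selectively:
# /*
# !public
```

The commented-out alternative shows the "ignore all, then re-include" pattern from the Advanced Use Cases section; rules are evaluated line by line, so later ! entries can re-include paths an earlier rule excluded.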

Last updated on Aug 05, 2025

09. deployments: generated urls

Accessing Deployments through Generated URLs When you create a new deployment, Epycbyte will automatically generate a unique URL which you can use to access that particular deployment. Table of Contents - When you create a new deployment - Viewing generated URLs - URL Components - Generated from Git - Generated with Epycbyte CLI - Truncation - Anti-phishing protection When you create a new deployment When you create a new deployment in either a preview or production environment, Epycbyte will automatically generate a unique URL in order for you to access that deployment. You can use this URL to access a particular deployment for as long as your set deployment retention policy allows. This URL is publicly accessible by default, but you can configure it to be private using deployment protection. Viewing generated URLs You can access these automatically generated URLs in the following ways: - On the command line when the build has completed. - When using Git, you can access either a URL for the branch or for each commit. To learn more, see Generated from Git. - Under the Project's Overview and Deployments tabs. URL Components Generated URLs are composed of several different pieces of data associated with the underlying deployment. Varying combinations of the following information may be used to generate a URL: - The name of the Project that contains the deployment (used for: Git branch, Git commit, CLI) - A unique hash of 9 randomly generated numbers and letters (used for: Git commit) - The slug (not the name) of the account or team that contains the project/deployment (used for: Git branch, Git commit, CLI) - The name of the Git branch for which the deployment was created (used for: Git branch) Generated from Git When working with Git, Epycbyte will automatically generate a URL for the following: - The commit: This URL will always show you a preview of changes from that specific commit. This is useful for sharing a specific version of your project at a point in time.
- url-structure <project-name>-<unique-hash>-<scope-slug>.epycbyte.app - The branch: The URL generated from a Git branch will always show you the most recent changes for the branch and won't change if you push new commits to the branch. For this reason, this format is ideal for sharing with team members during your review process. - url-structure <project-name>-git-<branch-name>-<scope-slug>.epycbyte.app Generated with Epycbyte CLI To access the URL for a successful deployment from Epycbyte CLI, you can save the standard output of the deploy command. The generated URL will have the following structure: - url-structure <project-name>-<scope-slug>.epycbyte.app - If the deployment is created on a Team, you can also use the URL specific to the deployment's author. - url-structure <project-name>-<author-name>-<scope-slug>.epycbyte.app Truncation If more than 63 characters are present before the .epycbyte.app suffix (or the respective Preview Deployment Suffix) for a generated URL, they will be truncated. Anti-phishing protection If your <project-name> resembles a regular web domain, it may be shortened to avoid that resemblance. For example, www-company-com would be changed to just company. This is done to prevent an accidental trigger of anti-phishing protection built into web browsers that protect the user from visiting domains that look roughly like other domains they visit. Preview Deployment Suffix Preview Deployment Suffix is available on Pro and Enterprise plans. Preview Deployment Suffixes allow you to customize the URL of a preview deployment by replacing the default epycbyte.app suffix with a custom domain of your choice. To learn more, see the Preview Deployment Suffix documentation.
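The URL structures and 63-character truncation described above can be sketched as a small helper. This is an illustration only: the exact point at which Epycbyte truncates, and the hash alphabet, are assumptions.

```python
import random
import string

SUFFIX = ".epycbyte.app"
MAX_LABEL = 63  # characters allowed before the suffix, per the Truncation section

def unique_hash(length=9):
    """A unique hash of 9 randomly generated numbers and letters."""
    alphabet = string.ascii_lowercase + string.digits
    return "".join(random.choice(alphabet) for _ in range(length))

def commit_url(project, scope, hash_):
    """<project-name>-<unique-hash>-<scope-slug>.epycbyte.app"""
    label = f"{project}-{hash_}-{scope}"
    return label[:MAX_LABEL] + SUFFIX  # truncate anything past 63 chars

def branch_url(project, branch, scope):
    """<project-name>-git-<branch-name>-<scope-slug>.epycbyte.app"""
    label = f"{project}-git-{branch}-{scope}"
    return label[:MAX_LABEL] + SUFFIX
```

For example, `branch_url("shop", "main", "acme")` composes the branch-style URL, and an 80-character project name gets cut down so that at most 63 characters precede the suffix.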

Last updated on Aug 05, 2025

09. deployments: git

Epycbyte and Git Deployment Overview 1. Project Setup and Production Branch Selection: - When creating a new project from a Git repository, Epycbyte automatically selects the production branch based on the presence of main or master branches. If these are absent, it uses the Git repository's default branch setting. 2. Team Management and Deployment Permissions: - Hobby Teams: Only team owners connected via their Git account can deploy commits. This ensures that non-owners cannot push changes unless they become members of a Pro team. - Pro Teams: Members can deploy after being added, with the commit author's identity checked against the team's Login Connections. 3. Pull Requests and Authorization: - After forking a public repository, pull requests may require authorization if they modify epycbyte.json or environment variables. This is a security measure to prevent sensitive information leaks. - If the commit author is already a team member, the authorization step is skipped. 4. Preview Branches and Phases: - Default Preview: Changes pushed to non-production branches (e.g., feature branches) are served through preview domains like <project-name>-git-<branch-name>-<scope-slug>.epycbyte.app. - Multiple Phases (e.g., Staging): Additional Git branches can be created to assign specific domains and environment variables, allowing for testing before merging into production. 5. Merging and Branch Management: - After testing in preview phases, changes are merged into the production branch. Preview branches are typically kept active for future use unless deleted.

Last updated on Aug 05, 2025

09. deployments: instant rollback

Performing an Instant Rollback on a Deployment Learn how to perform an Instant Rollback on your production deployments and quickly roll back to a previously deployed production deployment. Epycbyte provides Instant Rollback as a way to quickly revert to a previous production deployment. This can be useful in situations that require a swift recovery from production incidents, like breaking changes or bugs. Table of Contents - Epycbyte's Instant Rollback - How to roll back deployments - Select your project - Select the deployment to roll back to - Verify the information - Confirm the rollback - Successful rollback - Accessing Instant Rollback from Deployments tab - Who can roll back deployments? Epycbyte's Instant Rollback Epycbyte provides Instant Rollback as a way to quickly revert to a previous production deployment. This can be useful in situations that require a swift recovery from production incidents, like breaking changes or bugs. How to roll back deployments To initiate an Instant Rollback from the Epycbyte dashboard: 1. Select your project 2. Select the deployment to roll back to 3. Verify the information 4. Confirm the rollback Select your project Select your project on the Epycbyte dashboard. Select the deployment to roll back to From the project's overview page, click the Instant Rollback button next to the Production Deployment tile. Verify the information Verify that you are rolling back to the correct deployment and that there are no changes in Environment Variables. Confirm the rollback Confirm the rollback by clicking the Confirm Rollback button. Successful rollback The rollback happens instantaneously, and Epycbyte will point your domain and sub-domain back to the selected deployment. Accessing Instant Rollback from Deployments tab You can also roll back from the main Deployments tab in your dashboard. Filtering the deployments list by main is recommended to view a list of eligible roll back deployments. Who can roll back deployments? 
- Hobby plan: On the Hobby plan, you can roll back to the previous deployment. - Pro and Enterprise plans: Owners and Members on these plans can roll back to any eligible deployment. Eligible deployments Deployments previously aliased to a production domain are eligible for Instant Rollback. Deployments that have never been aliased to a production domain (e.g., most preview deployments) are not eligible. Comparing Instant Rollback and manual promote options To compare the manual promotion options, see Manually promoting to Production.

Last updated on Aug 05, 2025

09. deployments: Accessing Build Logs

Accessing Build Logs When you deploy your website to Epycbyte, the platform generates build logs that show the deployment progress. These logs are particularly useful for debugging issues that may arise during deployment. Table of Contents 1. What are build logs? 2. How build logs work 3. Link to build logs 4. Save logs What are build logs? Build logs contain information about: - The version of the build tools - Warnings or errors encountered during the build process - Details about files and dependencies installed, compiled, or built during deployment These logs help identify the root cause of deployment failures. How build logs work - Build logs are generated at build time for all Deployments. - The logs resemble your framework's Build Command output but include additions from Epycbyte's build system. - Once a build is complete, no new logs will be recorded. In addition to build actions: - Errors are highlighted in red. - Warnings are highlighted in yellow. Link to build logs 1. Click the Build Logs button on the production deployment tile in the projects overview page. 2. View build logs for your Epycbyte deployments. Each log entry has a timestamp. Clicking a timestamp provides a link to that specific log entry, using an anchor such as #L6. - Hold the Shift key and click timestamps to create links for multiple lines (e.g., #L6-L9). - Links are only accessible to team members with valid Epycbyte accounts. Save logs - Use configurable log drains to export, store, and analyze build logs. - Log drains can be configured through the Epycbyte dashboard or logging integrations. Last updated on July 22, 2024.
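The line anchors mentioned above follow a simple pattern; as an illustration (not Epycbyte's code), the fragment for a single line or a shift-clicked range can be formed like this:

```python
def log_anchor(start, end=None):
    """Build the URL fragment for linking to build log lines:
    "#L6" for a single line, "#L6-L9" for a multi-line range."""
    if end is None or end == start:
        return f"#L{start}"
    return f"#L{start}-L{end}"
```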

Last updated on Aug 05, 2025

09. deployments: og preview

Inspecting your Open Graph metadata You can use the Open Graph tab on every deployment on Epycbyte to validate and view your Open Graph (OG) data across a range of social media sites before you share it out. Routes using Deployment Protection can also be inspected. To view your data: - Choose your account or team from the scope selector - Select your project and go to the Deployments tab - From the Deployments tab, select the deployment you wish to view the metadata for - Select the Open Graph tab: - From here, you can view the metadata and a preview for Twitter, Slack, Facebook, and LinkedIn for specific pages in your deployment - Filter by pathname You can use the Path dropdown to view the OG card for any page on that particular deployment. Metadata These properties are set by the Open Graph protocol. Property Value Description - title The title tag used to name the page. 70 characters max. This is used by default if no other values are passed. - description The meta.description tag used to describe the page. 200 characters max. This is used by default if no other values are passed. - og:image Absolute URL to image. Use the OG Image Generation documentation to create new images. - og:title A title for link previews. You can use this to override the meta title if you want the OG title to be different. - og:description A one to two sentence description for link previews. You can use this to override the meta description if you want the OG description to be different. - og:url A canonical URL for link previews. You should provide the absolute URL. Twitter-specific metadata Property content value Additional information - twitter:image A URL to an image file or to a dynamically generated image. og:image is used as a fallback. JPG, PNG, WEBP, and GIF are supported; SVG is not supported. - twitter:card The type of card used for Twitter link previews: summary, summary_large_image, app, or player. - twitter:title A string that shows for Twitter link previews. og:title is used as a fallback. 70 characters max. - twitter:description A description for Twitter link previews. og:description is used as a fallback. 200 characters max.
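In a page's head, the properties listed above correspond to standard Open Graph and Twitter meta tags. The values below are illustrative:

```html
<head>
  <title>Acme Store</title>
  <meta name="description" content="Hand-picked goods, shipped fast." />
  <meta property="og:title" content="Acme Store" />
  <meta property="og:description" content="Hand-picked goods, shipped fast." />
  <meta property="og:url" content="https://acme.example.com/" />
  <meta property="og:image" content="https://acme.example.com/og.png" />
  <meta name="twitter:card" content="summary_large_image" />
  <meta name="twitter:title" content="Acme Store" />
</head>
```

Note that Open Graph tags use the property attribute while Twitter tags use name; the Open Graph tab renders previews from exactly this markup.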

Last updated on Aug 05, 2025

09. deployments: skew protection

Skew Protection Skew Protection is available on Pro and Enterprise plans. Those with the owner, admin, or member role can access this feature. Version skew occurs when different versions of your application run on client and server, causing application errors and other unexpected behavior. Epycbyte's Skew Protection resolves this problem at the platform and framework layer by using version locking, which ensures client and server use the exact same version. Enable Skew Protection Projects created after November 19th, 2024 using one of the supported frameworks already have Skew Protection enabled by default. For older projects, you can enable Skew Protection in your project's settings: 1. Select the project in the Epycbyte dashboard 2. Select the Settings tab in the top menu 3. Select the Advanced tab in the side menu 4. Scroll down to Skew Protection and enable the switch 5. For Enterprise Teams, you can also set a custom duration (see limitations) Configure Skew Protection In some cases, you may have problematic deployments you want to ensure no longer resolve requests from any active clients. Once you deploy a fix, you can configure Skew Protection with the following steps: 1. Select the deployment that fixed the problem in the deployment list 2. Select the button (near the Visit button) 3. Click Configure Skew Protection 4. Click Save to apply the changes Monitor Skew Protection You can observe how many requests are protected from version skew by visiting the Monitoring page in the Epycbyte dashboard. For example, on the requests event, filter where skew_protection = 'active'. Supported frameworks Skew Protection is available with zero configuration when using the following frameworks: - Next.js - SvelteKit - Qwik - Astro - Nuxt (coming soon) Other frameworks can implement Skew Protection by checking if epycbyte_SKEW_PROTECTION_ENABLED has value 1 and then appending the value of epycbyte_DEPLOYMENT_ID to each request using one of the following options.
Limitations Skew Protection is only available for Pro and Enterprise, not for Hobby teams. Skew Protection is enabled for 12 hours on Pro accounts and a custom duration on Enterprise accounts. Resources - Epycbyte Skew Protection - Read the announcement blog for Epycbyte's Skew Protection. - Version Skew Learn about version skew in depth from Epycbyte's CTO.
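For frameworks without built-in support, the approach described above (check the enable flag, then append the deployment ID to each request) can be sketched as follows. How the ID is attached is an assumption — the page only says it must be appended to each request — so the header name here is illustrative:

```python
import os

def versioned_request_headers(base=None):
    """Attach the deployment ID to outgoing request headers when Skew
    Protection is enabled, so the server can route the request to the
    matching deployment version. The header name "x-deployment-id" is
    an assumed, illustrative choice."""
    headers = dict(base or {})
    if os.environ.get("epycbyte_SKEW_PROTECTION_ENABLED") == "1":
        deployment_id = os.environ.get("epycbyte_DEPLOYMENT_ID", "")
        if deployment_id:
            headers["x-deployment-id"] = deployment_id
    return headers
```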

Last updated on Aug 05, 2025

10. analytics: Epycbyte Web Analytics

Epycbyte Web Analytics With Web Analytics, you can gain detailed insights into your website's visitors through metrics like top pages, referrers, and demographics. Below is a comprehensive guide to understanding and utilizing Epycbyte's Web Analytics. Table of Contents - Web Analytics Availability - Comprehensive Insights - Privacy Features - Integrated Infrastructure - Customizable Tracking - Quickstart Setup - Speed Insights - Visitors Overview - How Visitors Are Determined - Page Views Analysis - Panels and Data Breakdown - Bot Traffic Filtering Web Analytics Availability Web Analytics is available on all Epycbyte plans, providing essential tools for tracking website performance without additional costs. Comprehensive Insights Epycbyte's Web Analytics offers in-depth visitor analysis, including: - Top visited pages - Referrers for specific pages - Demographics such as location, OS, and browser information Privacy Features Web Analytics prioritizes user privacy by: - Storing only anonymized data - Avoiding cookies - Respecting visitor experience Integrated Infrastructure The analytics tool is seamlessly integrated into the Epycbyte platform, accessible via your project dashboard without third-party dependencies. Customizable Tracking You can configure Web Analytics to track: - Custom events - Feature flag usage - Visitor behavior Quickstart Setup For setup instructions, refer to the Quickstart. Speed Insights To analyze site performance, use Speed Insights. Visitors Overview The Visitors tab displays unique visitors within a selected timeframe. Adjust the timeframe using the dropdown and explore detailed visitor data through panels. How Visitors Are Determined Unlike traditional analytics tools, visitors are identified by a hash generated from each request: - Privacy-friendly approach - Hashes expire daily and reset - First visit tracked immediately Page Views Analysis The Page Views tab tracks every page load, counting multiple views per visitor. 
Panels and Data Breakdown Panels provide detailed analytics on: - Top pages - Referrers - Demographics (country, OS, device/browser) - Custom events - Feature flag usage Exporting Data You can export up to 250 entries from panels as CSV files. For details, see Exporting data as CSV. Bot Traffic Filtering Web Analytics excludes automated traffic by inspecting the User Agent header.

Last updated on Aug 05, 2025

11. cli: Epycbyte CLI Overview

Epycbyte CLI Overview Learn how to use the Epycbyte command-line interface (CLI) to manage and configure your Epycbyte Projects from the command line. Table of Contents - Installing Epycbyte CLI - Updating Epycbyte CLI - Checking the Version - Using in a CI/CD Environment - Available Commands Epycbyte provides a command-line interface (CLI) to interact with and configure your Epycbyte Projects. With this tool, you can manage Domain Name System (DNS) records, retrieve logs, and more, all from the terminal. Installing Epycbyte CLI To download and install Epycbyte CLI with pnpm, run the following command: pnpm i -g epycbyte Updating Epycbyte CLI When a new release of Epycbyte CLI is available, running any command will notify you. To update, use the installation command again: pnpm i -g epycbyte@latest If you encounter permission issues, refer to npm's official guide. Checking the Version To verify the version of Epycbyte CLI, use the --version option: epycbyte --version Using in a CI/CD Environment In automated environments, log in using tokens. Create a token on your account page and use the --token option. Available Commands - alias - bisect - build - certs - deploy - dev - dns - domains - env - git - help - init - inspect - link - list - login - logout - logs - project - promote - pull - redeploy - remove - rollback - switch - teams - whoami

Last updated on Aug 05, 2025

13. cron-jobs: cron jobs

Cron Jobs Cron jobs are time-based scheduling tools used to automate repetitive tasks on Epycbyte. By using a specific syntax called a cron expression, you can define the frequency and timing of each task. This helps improve efficiency and ensures that important processes are performed consistently. What Are Cron Jobs? Cron jobs are automated tasks that run at predefined times. They are widely used for tasks such as: - Backups and archiving - Email and Slack notifications - Updating subscription quantities Epycbyte supports cron jobs for both Serverless and Edge Functions, allowing you to automate tasks with ease. How Epycbyte Supports Cron Jobs Epycbyte makes it simple to set up and manage cron jobs. You can add cron jobs through: - epycbyte.json - Build Output API For example, an endpoint like https://*.epycbyte.app/api/cron can be used to trigger a cron job. Common Use Cases 1. Automating Backups 2. Sending Email Notifications 3. Updating Subscription Quantities 4. Scheduling Social Media Updates How to Add Cron Jobs You can add cron jobs by defining them in your epycbyte.json file or using the Build Output API. Epycbyte supports the following cron expression format: minute hour day_of_month month day_of_week For example: - 5 * * * * triggers at 5 minutes past every hour. - * * 5 * * triggers every minute on the 5th day of the month. Managing Cron Jobs When managing cron jobs, consider: - Frequency: Define how often the job runs. - Error Handling: Ensure tasks are retried if they fail. - Deployments: Manage multiple environments with different schedules. - Concurrency Control: Avoid overlapping tasks. You can also run cron jobs locally for testing. Usage and Pricing For detailed information on usage limits, pricing, and deployment options, visit the Usage and Pricing page. Cron Job Templates 1. Cron OG Cards: A template for updating social media cards. 2. Epycbyte Cron Job Example: A Next.js app that updates data at different intervals.
Get started in minutes by following the Quickstart guide.
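Based on the epycbyte.json approach and the five-field expression format described above, a cron definition might look like the following. The "crons" key and field names are assumptions modeled on the text, not a documented schema:

```json
{
  "crons": [
    {
      "path": "/api/cron",
      "schedule": "5 * * * *"
    }
  ]
}
```

The schedule "5 * * * *" triggers the /api/cron endpoint at 5 minutes past every hour.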

Last updated on Aug 05, 2025

13. cron-jobs: Usage & Pricing for Cron Jobs

Usage & Pricing for Cron Jobs Cron jobs are available on all Epycbyte plans, offering flexible scheduling options to suit different needs. Below is a detailed overview of the usage and pricing details for cron jobs. Table of Contents - Cron Jobs Overview - Number of cron jobs per account - Hobby scheduling limits - Pro and Enterprise plans - Pricing Details Cron Jobs Overview Cron jobs can be triggered by Serverless or Edge Functions, meaning the same usage and pricing limits apply to both. Number of cron jobs per account - Hobby plan: 2 cron jobs - Pro plan: 40 cron jobs - Enterprise plan: 100 cron jobs Each project has a hard limit of 20 cron jobs. Hobby Scheduling Limits On the Hobby plan, Epycbyte cannot guarantee timely cron job invocations. For example: - A cron job configured as 0 1 * * * (every day at 1 am) will trigger anywhere between 1:00 am and 1:59 am. - More specific execution times may require upgrading to the Pro plan. Pro and Enterprise Plans - Pro plan: Unlimited cron invocations - Enterprise plan: Unlimited cron invocations Pricing Details Cron jobs are included in all plans. However, using a function to invoke a cron job means that usage and pricing limits for these functions apply to all cron job executions. See Functions Limits and Pricing (last updated on July 23, 2024). For more details, refer to the Manage Cron Jobs section.

Last updated on Aug 05, 2025

14. dashboard-features: Dashboard Overview

Dashboard Overview The Epycbyte dashboard provides a comprehensive overview of your projects, deployments, and account management. Below is a detailed guide on how to navigate and utilize the dashboard effectively. Table of Contents 1. Overview 2. Scope Selector 3. Integrations 4. Activity 5. Recent Previews 6. Domains 7. Usage 8. Settings 9. Command Menu Overview When you first log in to Epycbyte, you are greeted by the dashboard overview. This section displays information on all projects associated with your account, regardless of their status. Depending on the selected scope (Personal Account or Team), you can view projects belonging to either your personal account or any teams you are part of. Scope Selector The scope selector allows you to switch between different accounts and teams. To navigate between personal and team accounts, simply select the arrows next to your current account name. This feature ensures that you can easily access the dashboard for any team you are a part of. Additionally, clicking on the Epycbyte logo or the scope selector will take you back to your Team dashboard. Integrations Integrations extend the capabilities of Epycbyte by connecting it with third-party platforms and services. Available to all users and teams, integrations allow you to: - View and manage a list of existing integrations. - Browse the marketplace to install new integrations. - Create custom integrations through the Integrations Console. Activity The Activity Log provides a chronological record of events for your Hobby team. The types of events you see depend on the account type and role within a Team. This feature helps you track important updates and changes related to your projects. Recent Previews The Recent Previews panel offers a quick way to access recently deployed and viewed previews. Each preview displays the latest deployment ID, status, and associated pull request or git branch information. 
Selecting a preview navigates you to its live version, while clicking on the deployment ID provides more details. Domains Epycbyte allows you to manage multiple domains under one account. This feature is essential for organizing your projects and ensuring smooth domain management. Usage The Usage section tracks how resources like compute, storage, and functions are being used by your projects. This information helps you optimize costs and performance. Settings The Settings dashboard enables you to manage various aspects of your account and projects, including: - Git settings - Functions and environment variables - Security configurations - Project-specific settings Command Menu Epycbyte provides a keyboard navigation feature called the Command Menu. You can access it by pressing ⌘ + K on macOS or Ctrl + K on Windows/Linux. This feature allows you to quickly navigate through the dashboard and perform common actions without using the mouse.

Last updated on Aug 05, 2025

15. observability: audit log

The actions listed are part of a logging system within Epycbyte, tracking various configurations, integrations, security settings, project management, environment variables, and team operations. Here's an organized summary: 1. Edge Configurations: - edge_config.created: New configuration added. - edge_config.deleted: Configuration removed. - edge_config.updated: Configuration modified. 2. Integrations: - integration.created: Integration setup initiated. - integration.deleted: Integration removed. - integration.updated: Integration details changed. 3. Password Protection: - password_protection.enabled: Protection activated. - password_protection.disabled: Protection deactivated. 4. Preview Deployment Suffix: - preview_deployment_suffix.enabled: Custom URL suffix active. - preview_deployment_suffix.disabled: Suffix deactivated. - preview_deployment_suffix.updated: Suffix modified. 5. Project Management: - project.analytics.enabled/disabled: Analytics tracking status. - project.deleted: Project removal. - project.env_variable.created/updated/deleted: Environment variable management. - project.password_protection.enabled/disabled/updated: Password protection settings. - project.sso_protection.enabled/disabled/updated: SSO protection configuration. - project.transfer.in/out with statuses (completed, failed, started): Project transfer tracking. 6. Project Web Analytics: - project.web-analytics.enabled/disabled: Web analytics status. 7. Shared Environment Variables: - shared_env_variable.created/decrypted/deleted/updated: Shared variable management. 8. Team Operations: - team.avatar.updated: Team profile picture change. - team.created/deleted: Team creation and deletion. - team.name/slug.updated: Team name or slug modification. - team.member.actions: Access requests, additions, removals, join/left status, role updates.

Last updated on Aug 05, 2025

15. observability: Working with Checks

Working with Checks

Epycbyte automatically keeps an eye on various aspects of your web application using the Checks API. Learn how to use Checks in your Epycbyte workflow here.

Table of Contents 1. Types of flows enabled by the Checks API 2. Checks lifecycle 3. Build your Checks Integration

Types of flows enabled by the Checks API
- Core checks: verify 200 responses on specific pages or APIs to determine the deployment's health and surface issues with code, errors, or broken connections.
- Performance: collects Core Web Vitals for specific pages and compares them against the new deployment, helping you decide whether to promote the deployment or block it for further investigation.
- End-to-end: validates that your deployment has all the components required to build successfully, and identifies broken pages, missing images, or other missing assets.
- Optimization: reports on bundle size, helping ensure your website keeps large assets such as packages and images under control.

Checks lifecycle
The complete lifecycle of a check:
1. When a deployment is created, Epycbyte triggers the deployment.created webhook. This tells integrators that checks can now be registered.
2. An integrator uses the Checks API to create the checks defined in the integration configuration.
3. When the deployment is built, Epycbyte triggers the deployment.ready webhook. This notifies integrators to begin running checks against the deployment.
4. Epycbyte waits until all created checks receive an update.
5. Once all checks report a conclusion, aliases are applied and the deployment goes live.

Build your Checks Integration
To build a Checks integration and publish it to the Integration Marketplace:
1. Provide low- or no-configuration solutions for developers to run checks.
2. Offer a guided onboarding process for developers, from installation to end result.
3. Surface relevant information about the outcome of each test on the Epycbyte dashboard.
4. Document how to go beyond the default behavior to build custom tests for advanced users.
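The lifecycle above can be sketched as a small webhook dispatcher. This is a minimal illustration, not the real Checks API: the event names (`deployment.created`, `deployment.ready`) come from this article, while `createChecksIntegration` and its in-memory state are hypothetical stand-ins for calls an integration would make to the Checks API.

```javascript
// Minimal sketch of an integration reacting to deployment webhooks.
// Event names come from the lifecycle above; everything else is illustrative.
function createChecksIntegration(checkNames) {
  const registered = new Map(); // deploymentId -> Map(checkName -> state)

  return {
    // Entry point for incoming webhook payloads.
    handleWebhook(event) {
      if (event.type === 'deployment.created') {
        // Steps 1-2: register the checks defined by the integration.
        registered.set(
          event.deploymentId,
          new Map(checkNames.map((name) => [name, 'registered']))
        );
      } else if (event.type === 'deployment.ready') {
        // Step 3: the build finished; start running each check.
        const checks = registered.get(event.deploymentId);
        for (const name of checks.keys()) checks.set(name, 'running');
      }
    },
    // Steps 4-5: report a conclusion for one check.
    conclude(deploymentId, name, conclusion) {
      registered.get(deploymentId).set(name, conclusion);
    },
    // Aliases apply only once every check has a conclusion.
    allConcluded(deploymentId) {
      const checks = registered.get(deploymentId);
      return [...checks.values()].every(
        (s) => s !== 'registered' && s !== 'running'
      );
    },
  };
}
```

Only once `allConcluded` returns true would Epycbyte apply aliases and take the deployment live.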

Last updated on Aug 05, 2025

15. observability: Working with Log Drains

Working with Log Drains Log drains are a powerful tool for capturing and analyzing logs from your applications. They allow you to store logs persistently, generate alerts, and create metrics for better insight into your application's behavior. Table of Contents - Introduction - Features of Log Drains - Pricing and Availability - Managing Active Log Drains - Integrating with Third-Party Services - Handling Errored Log Drains - IP Address Visibility - Conclusion Introduction Log drains are essential for monitoring and debugging your applications. They enable you to collect logs from various sources, store them securely, and analyze them for insights. Epycbyte provides robust log drain capabilities, available on Pro and Enterprise plans. Features of Log Drains - Persistent Storage: Store logs securely for future reference. - Large Volume Handling: Manage large volumes of logs efficiently. - Alerting and Metrics: Create alerts based on logging patterns and generate metrics for analysis. - Support for Multiple Protocols: Forward logs using HTTPS, HTTP, TLS, or TCP. Pricing and Availability - Pro and Enterprise Plans: Log drains are included in these plans. - Hobby Plan: Users can view log drain settings but cannot configure them. - Free Usage for Existing Users: Log drains created before May 23, 2024, on the Hobby plan remain free. Managing Active Log Drains 1. Access Settings: Go to your dashboard and select "Settings." 2. Select Log Drains: Choose from the available options. 3. Disable or Remove: Use the context menu to disable or remove log drains as needed. Integrating with Third-Party Services Epycbyte's Integration Marketplace offers a variety of logging services. Follow these steps: 1. Install Integration: Search for the desired service in the marketplace. 2. Configure Settings: Set up the integration according to the provider's instructions. 3. Connect Project: Choose a project to link with the service. 
Handling Errored Log Drains If more than 80% of log deliveries fail within an hour, Epycbyte sends notifications and marks the drain as errored. This helps in identifying issues quickly. IP Address Visibility Manage IP addresses associated with your log drains to ensure compliance with your organization's policies. Conclusion Log drains are a valuable tool for monitoring and analyzing logs. Proper management and integration can enhance your application's performance and security. For more details, refer to the Epycbyte documentation or contact support.
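The errored-drain rule above can be expressed concretely. This is a hedged sketch — the real evaluation happens inside Epycbyte, and the delivery-record shape here is an assumption; the function simply applies the rule as stated: more than 80% of deliveries failing within an hour marks the drain as errored.

```javascript
// Decide whether a log drain should be marked as errored, per the rule
// above: more than 80% of deliveries failed within a one-hour window.
// Each delivery is { ok: boolean, timestamp: epoch ms } (illustrative shape).
function isDrainErrored(deliveries, now, windowMs = 60 * 60 * 1000) {
  const recent = deliveries.filter((d) => now - d.timestamp <= windowMs);
  if (recent.length === 0) return false; // no data, nothing to flag
  const failed = recent.filter((d) => !d.ok).length;
  return failed / recent.length > 0.8; // strictly more than 80%
}
```

Note the strict inequality: exactly 80% failures does not trip the threshold under this reading of "more than 80%".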

Last updated on Aug 05, 2025

15. observability: Monitoring

Monitoring Table of Contents - Overview - Example Queries - Save New Queries - Manage Saved Queries - Error Messages - Enable Monitoring - Disable Monitoring - Manage IP Address Visibility - More Resources Monitoring allows you to visualize and quantify the performance and traffic of your projects on Epycbyte. You can use example queries or create custom queries to debug and optimize bandwidth, errors, performance, and bot traffic issues in a production or preview deployment. Example Queries Use our pre-defined example queries to quickly analyze common metrics like top bandwidth images or other key performance indicators. Save New Queries You can save your favorite queries for future reference by accessing the left navigation bar. Personal queries are private to the user, while team queries require owner or member roles to access. Manage Saved Queries - Duplicate: Create a copy of the query. - Rename: Change the name of the query. - Delete: Remove the query permanently. Error Messages If you encounter issues like invalid queries, ensure all parameters are correct. Invalid queries may result in no data being displayed. Enable Monitoring To enable monitoring on Pro plans: 1. Navigate to the Monitoring tab in your dashboard. 2. Click "Enable Monitoring" and follow the confirmation steps to access the feature. Disable Monitoring To disable monitoring: 1. Go to Settings > Billing. 2. Toggle off the Monitoring section to disable it. Manage IP Address Visibility IP addresses are hidden by default for privacy reasons. To manage this: 1. Ensure your team is selected in the scope selector. 2. Navigate to Security & Privacy and toggle IP address visibility as needed. More Resources For detailed information on queries, limits, and pricing, refer to the Monitoring Reference and Limits & Pricing sections.

Last updated on Aug 05, 2025

15. observability: Frontend Observability

Frontend Observability

Frontend observability on Epycbyte helps you monitor, analyze, and manage your applications. It includes monitoring to track usage and performance, auditing to record activities by team members, logging mechanisms to visualize real-time data, and more.

Table of Contents - Runtime Logs - Web Analytics - Speed Insights - Monitoring - Activity Logs - Audit Logs - Log Drains - OpenTelemetry Collector

Observability on Epycbyte provides you with:
- Health and performance metrics about your website.
- Transparency for security investigations and compliance reporting.
- Greater operational visibility into your web application.

Runtime Logs
- Runtime logs are available on all plans.
- Those with the owner, member, or developer role can access this feature.
- Runtime logs allow you to search, inspect, and share your team's runtime logs at a project level.
- You can search runtime logs from the deployments section inside the Epycbyte dashboard.
- Your log data is retained for 3 days.
- For longer log storage, you can use log drains.

Web Analytics
- Web analytics gives you in-depth insights into your website's visitors.
- Showcases metrics such as top pages, referrers, and demographic data like countries, operating systems, and browser usage.

Speed Insights
- Epycbyte Speed Insights provides a convenient way to analyze your site's performance metrics directly within the dashboard.
- Easily view and filter data based on device type, user percentiles, and reporting windows.
- Helps you understand and optimize your site's speed and user experience.

Monitoring
- Available on Pro and Enterprise plans.
- Allows you to visualize, explore, and track usage and traffic.
- Using the query editor, you can create custom queries to gain greater insights into your application.
- This allows you to debug issues and optimize all the projects on your Epycbyte team.

Activity Logs
- Activity logs provide chronologically organized events on your personal or team account.
- Overview of changes to your environment variables, deployments, and more.

Audit Logs
- Available in Beta on Enterprise plans.
- Those with the owner role can access this feature.
- Audit logs allow owners to track events performed by other team members.
- The feature helps you verify who accessed what, for what reason, and at what time.
- You can export up to 90 days of audit logs to a CSV file.

Log Drains
- Available on Pro and Enterprise plans.
- Log drains allow you to export your log data, making it easier to debug and analyze.
- You can configure log drains through the Epycbyte dashboard or through one of our log drains integrations.

OpenTelemetry Collector
- Available in Beta on Pro and Enterprise plans.
- Epycbyte's OpenTelemetry (OTEL) collector enables seamless transmission of OTEL traces from your Serverless Functions to APM vendors like New Relic.
- Traces provide valuable insights into your application's performance, helping pinpoint and resolve issues.

Last updated on Aug 05, 2025

15. observability / otel-overview: Quickstart for using the Epycbyte OpenTelemetry Collector

Quickstart for using the Epycbyte OpenTelemetry Collector

Learn how to get started with OTEL on Epycbyte to send traces from your Serverless or Edge Functions to application performance monitoring (APM) vendors.

Overview
- Epycbyte's OpenTelemetry collector is available in Beta on Pro and Enterprise plans.
- Epycbyte has an OpenTelemetry (OTEL) collector that allows you to send OTEL traces from your Serverless or Edge Functions to APM vendors such as New Relic.
- Use of the OTEL collector is recommended for its improved performance, but it is not strictly required.
- If desired, you can configure the OTEL SDK to use custom trace exporters.

Prerequisites
You must be using an OTEL integration:
- New Relic (available in Beta to teams on Pro and Enterprise plans)
- DataDog (available in Beta to teams on Pro and Enterprise plans)

Get Started

1. Install an OTEL integration to visualize traces
- Select an integration from the Observability category in the Marketplace (such as DataDog or New Relic).
- Click the Add Integration button to begin the installation and follow each step to add the correct scope.
- If you have already installed an OTEL integration, you can skip this step.

2. Enable traces
- Ensure that tracing is enabled in your APM vendor's settings.
- Verify that your application is properly instrumented for OTEL collection.

3. Initialize OTEL
- Follow the instructions provided by your APM vendor to set up the OTEL integration. For example:

import { registerOTel } from '@epycbyte/otel'

registerOTel({
  serviceName: 'your-project-name',
  traceExporter: new MyCustomExporter(),
})

4. Start tracing requests in your project
- Ensure that your application is sending metrics and traces to the Epycbyte collector.
- Verify that your APM vendor is receiving and processing the data correctly.

5. Deploy your project to Epycbyte
- Follow the deployment instructions provided by Epycbyte.
- Make sure that your application is running in an environment where OTEL collection is enabled.

Custom OTEL exporters
- If you are using Sentry v8+, follow the Sentry documentation to learn how to use your existing custom OpenTelemetry setup.

Last updated on Aug 05, 2025

15. observability: Runtime Logs

Understanding Runtime Logs: A Comprehensive Guide Table of Contents 1. What are Runtime Logs? 2. Available Log Types 3. Viewing and Filtering Logs 4. Log Levels 5. Function and Host Filtering 6. Deployment Options 7. Request Method and Path 8. Cache Status 9. Saving Log Presets 10. Search Capabilities 11. Log Details 12. Sharing Logs 13. Log Limits What are Runtime Logs? Runtime logs are a crucial component of monitoring and debugging applications, providing real-time insights into the operation and performance of your systems. These logs capture events as they occur, helping developers identify issues, track user interactions, and optimize system performance. Available Log Types Several types of logs are generated during runtime: - Request Logs: Capture details about incoming requests, including timestamps, HTTP status codes, and request methods. - Error Logs: Record errors and exceptions encountered by the application, aiding in troubleshooting. - Audit Logs: Track user actions for security and compliance purposes. - Debug Logs: Provide detailed information useful for development and debugging. Viewing and Filtering Logs To access runtime logs, you can use tools like web interfaces or APIs. These tools allow you to filter logs based on criteria such as: - Time Range: View logs from a specific timeframe. - Log Level: Filter by severity (e.g., Info, Warning, Error). - Function Name: Focus on logs generated by specific functions. - Host Name: Narrow down logs based on the domain or subdomain. Log Levels Logs are categorized by their severity levels: - Info: General information about system events. - Warning: Potential issues that may require attention. - Error: Significant issues affecting functionality. - Critical: Severe issues that must be addressed immediately. Function and Host Filtering You can filter logs by the function name or host name, allowing you to isolate logs from specific components or domains. 
Deployment Options Logs are generated based on the deployment environment (e.g., production, staging). This helps in tracking performance and issues across different environments. Request Method and Path Log entries often include details about the request method (e.g., GET, POST) and path, providing insights into user interactions with your application. Cache Status Logs may also include information about cache status, such as whether a request was served from the cache or fetched fresh. Saving Log Presets To streamline your workflow, you can save log filter configurations as presets. This allows you to quickly access common viewing setups. Search Capabilities In addition to filtering, you can search logs using keywords in the message field, helping to locate specific issues or patterns. Log Details When viewing individual log entries, detailed information is displayed, including: - Timestamp - HTTP Status Code - Host Name - Request ID - User Agent - Log Level - Cache Status - Function Name - Location (geographical region) - Runtime Environment - Duration/Latency - Error Details Sharing Logs Logs can be shared with team members or external stakeholders, facilitating collaboration and troubleshooting. Log Limits The amount of data retained and the speed at which logs are processed depend on your plan. Higher-tier plans typically offer more storage and faster processing times. By leveraging these features, you can effectively monitor and manage your application's runtime behavior, ensuring optimal performance and user experience.
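The filter criteria above (time range, level, function, host, keyword search) combine so that every provided criterion must match. The sketch below illustrates that combination; the field names are assumptions, not the dashboard's actual log schema.

```javascript
// Combine runtime-log filters the way the dashboard does: every
// criterion that is provided must match. Field names are illustrative.
function filterLogs(logs, { from, to, level, functionName, host, search } = {}) {
  return logs.filter(
    (log) =>
      (from === undefined || log.timestamp >= from) &&
      (to === undefined || log.timestamp <= to) &&
      (level === undefined || log.level === level) &&
      (functionName === undefined || log.functionName === functionName) &&
      (host === undefined || log.host === host) &&
      (search === undefined || log.message.includes(search))
  );
}
```

For example, `filterLogs(logs, { level: 'error', host: 'app.example.com' })` narrows to error-level entries on one domain, mirroring stacking filters in the UI.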

Last updated on Aug 05, 2025

16. speed-insights: metrics

Understanding Website Performance Metrics: A Comprehensive Overview 1. Key Performance Metrics: - First Contentful Paint (FCP): Measures the time taken for the first content to be rendered on a page, crucial for initial load speed. - First Input Delay (FID): Assesses the delay before a page responds to user interactions, such as clicks. 2. Percentile Calculations: - Metrics like P75, P90, and P95 indicate that 75%, 90%, and 95% of users experience faster load times than the specified value. For example, a P75 score of 1 second means 75% of users see FCP in under a second. 3. Color-Coding System: - Scores are color-coded as Poor (0-49), Needs Improvement (50-89), and Good (90-100). Aiming for 'Good' scores is recommended for optimal user experience, though higher scores become increasingly challenging to achieve. 4. Real Experience Score (RES) vs. Virtual Experience Score (VES): - RES: Based on real user data post-deployment, providing insights after the fact. - VES: Predictive, using metrics like Total Blocking Time (TBT), allowing anticipation of performance changes pre-deployment. 5. Device Considerations: - Device type affects scores; Checkly uses "Desktop" for VES, potentially underweighting mobile users. Weighting strategies may be necessary for comprehensive analysis. 6. Effort vs. Outcome: - Improving from 99 to 100 on RES is harder than moving from 90 to 94 due to diminishing returns as scores approach perfection. 7. Impact on User Experience and Search Rankings: - Higher scores enhance user experience but may not directly boost search rankings unless they move into higher color categories, like green from orange. 8. Metric Differences: - VES uses TBT instead of FID, offering a broader view of user experience by considering total interaction delays. 9. Data Updates and Analysis: - RES is updated with recent user data, while VES relies on automated checks for predictions. 10. 
Targeted Optimization: - By identifying user drop-off points and slow responses, developers can focus on specific areas to enhance overall performance.
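The percentile figures above work as follows: a P75 of 1 second means 75% of measured samples were at or below 1 second. A minimal nearest-rank percentile sketch (one of several common percentile definitions; real analytics pipelines may interpolate differently):

```javascript
// Nearest-rank percentile: the smallest sample value such that at
// least p% of all samples are less than or equal to it.
function percentile(samples, p) {
  const sorted = [...samples].sort((a, b) => a - b);
  const rank = Math.ceil((p / 100) * sorted.length);
  return sorted[Math.max(rank - 1, 0)];
}
```

With FCP samples of [0.4, 0.8, 1.0, 1.2] seconds, `percentile(samples, 75)` selects the third-smallest value, matching the intuition that three quarters of visitors saw that time or better.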

Last updated on Aug 05, 2025

16. speed-insights: Speed Insights Overview

Speed Insights Overview

This page lists and explains the performance metrics provided by Epycbyte's Speed Insights feature.

Table of Contents - Speed Insights Availability - Data Collection - Dashboard View - Performance Metric Views - Quickstart - Usage & Pricing

Speed Insights Availability
- Available on all plans.
- Provides a detailed view of website performance metrics based on Core Web Vitals.
- Enables data-driven decisions for optimizing your site.

Data Collection
- Granular visitor data using Web Analytics.
- Tracking in all environments: preview, production, and deployments.

Dashboard View
- Accessible via the Epycbyte dashboard after enabling Speed Insights.
- Project view: Speed Insights tab.
- Parameters:
  - Device type: mobile or desktop toggle.
  - Environment: preview, production, or all.
  - Time range: predefined or custom.
  - Performance metric: RES (Real Experience Score), FCP (First Contentful Paint), LCP (Largest Contentful Paint).

Performance Metric Views
- Line graph (P75, P90, P95, P99).
- Kanban board (routes, paths, HTML elements).
- Geographical map (P75 score with color intensity).

Quickstart
- Enable Speed Insights and access the dashboard.
- Sort and inspect data based on parameters.

Usage & Pricing
- Costs based on data points.
- Data calculation details.
- Performance metrics explanation.

Last updated on July 23, 2024. For more information, refer to the Quickstart guide or explore related articles.

Last updated on Aug 05, 2025

17. security: Access Control

Access Control

Epycbyte implements several measures to ensure the security and compliance of your data, including DDoS mitigation, SOC 2 compliance, and more. This article outlines Epycbyte's approach to access control, focusing on password protection, SAML SSO, and other security features.

Table of Contents 1. Compliance Measures 2. Password Protection 3. Epycbyte Authentication 4. SAML SSO 5. Deployment Protection

Compliance Measures
Epycbyte adheres to strict compliance standards to protect your data. Key measures include:
- DDoS Mitigation: Advanced systems to prevent distributed denial-of-service attacks.
- SOC 2 Compliance: Ensures secure handling of your data in the cloud.

Password Protection
Password protection is available for both Preview and Production deployments, depending on the plan:
- Pro and Enterprise plans: Password protection is available (on Pro plans via the Advanced Deployment Protection add-on).
- Enterprise plan only: SAML SSO is available for access control.
Both methods can be used to protect Preview and Production environments.

Epycbyte Authentication
Epycbyte Authentication adds an extra layer of security for Preview and Production deployments:
- Personal accounts with team membership: Users must use their login credentials to access the deployment.
- Enabled via the team's project dashboard: This feature can be turned on separately.
Epycbyte Authentication can be combined with Password Protection, allowing users to choose between methods when accessing the deployment.

SAML SSO
SAML Single Sign-On (SSO) is supported for Enterprise plan users:
- Version v0: This implementation is currently in use.
- Security and compliance: Ensures secure access control for sensitive deployments.

Deployment Protection
Epycbyte provides robust protection for your deployments:
- Preview deployments: Both Password Protection and Epycbyte Authentication are available.
- Production deployments: Password Protection, SAML SSO, or a combination of both can be used.

References
1. Password Protection documentation
2. Epycbyte Authentication documentation

This article is based on version v0 of the Epycbyte documentation.

Last updated on Aug 05, 2025

17. security: DDoS Mitigation

Table of Contents 1. Compliance Measures 2. Firewall 3. TLS Fingerprints 4. Access Control 5. SAML SSO 6. HTTPS/SSL 7. Open System Interconnection (OSI) Model 8. Layer 3 DDoS 9. Layer 4 DDoS 10. Layer 7 DDoS 11. What to do in case of a DDoS attack 12. Bypass System-level Mitigations 13. Billing for DDoS 14. Contact Us

Compliance Measures
Epycbyte complies with industry standards and regulations to provide secure solutions for your business needs.

Firewall
The firewall acts as a barrier between trusted internal networks and untrusted external networks, protecting against unauthorized access.

TLS Fingerprints
TLS fingerprinting helps identify malicious traffic by examining characteristics of a client's TLS handshake, such as the cipher suites and extensions it offers.

Access Control
Strict access control ensures that only authorized users can access sensitive data or resources.

SAML SSO
Single Sign-On (SSO) simplifies user authentication across multiple systems, reducing the need for multiple passwords.

HTTPS/SSL
SSL/TLS encrypts data during transmission to protect it from interception.

Open System Interconnection (OSI) Model
The OSI model defines seven layers of networking: physical, data link, network, transport, session, presentation, and application.

Layer 3 DDoS
Layer 3 attacks target the network layer, flooding the network with traffic to cause congestion.

Layer 4 DDoS
Layer 4 attacks target the transport layer, overwhelming servers with too many connections.

Layer 7 DDoS
Layer 7 attacks target the application layer, overwhelming servers with requests.

What to do in case of a DDoS attack
1. Monitor traffic for unusual patterns.
2. Use DDoS protection tools.
3. Engage security teams to analyze and mitigate threats.
4. Keep systems patched to prevent vulnerabilities.

Bypass System-level Mitigations
To disable automatic mitigations temporarily:
- Click the menu button in the Firewall tab.
- Select "Disable System Mitigations."
- Confirm in the dialog box; mitigations remain disabled for 24 hours.
Billing for DDoS Epycbyte automatically mitigates L3, L4, and L7 DDoS attacks at no extra cost. You are billed for legitimate traffic and any illegitimate traffic that bypasses mitigation. Contact Us For support or questions, visit the Epycbyte website or contact their customer service team.

Last updated on Aug 05, 2025

17. security / deployment-protection / methods-to-bypass-deployment-protection: Protection Bypass for Automation

Protection Bypass for Automation Overview Epycbyte's Protection Bypass for Automation feature allows you to temporarily bypass deployment protection mechanisms, such as password protection, authentication, and trusted IPs, for automated tools like end-to-end testing. This is particularly useful when automating processes that require access to your application. Table of Contents 1. Compliance Measures 2. Methods to Bypass Deployment Protection 3. Protection Bypass for Automation 4. How-to Protection Bypass for Automation Compliance Measures - Shared Responsibility: Ensure that compliance measures are shared between your team and Epycbyte. - Firewall Access Control: Implement strict firewall rules to control access to sensitive areas. - SAML/SSO: Use Single Sign-On (SSO) solutions for secure authentication. - HTTPS/SSL: Enforce HTTPS to secure data in transit. - Directory Sync: Synchronize directories for user management. Methods to Bypass Deployment Protection - Password Protection: Temporarily disable or bypass password protection for automated tools. - Epycbyte Authentication: Use generated secrets to bypass Epycbyte's authentication layer. - Trusted IPs: Extend trusted IP ranges for testing environments. Protection Bypass for Automation - Generated Secret: A secret value is created to bypass deployment protection. - System Environment Variable: The secret is automatically added as epycbyte_AUTOMATION_BYPASS_SECRET to your deployments. - Redeployment Requirement: Updating the secret will require redeploying your application. Who Can Manage Protection Bypass for Automation? - Team Members: Those with at least the "Member" role can manage bypass settings. - Project Administrators: Users with the "Project Administrator" role have full control. Using Protection Bypass for Automation 1. Set Header or Query Parameter: Use x-epycbyte-protection-bypass header or query parameter with your generated secret. x-epycbyte-protection-bypass: your-generated-secret 2. 
Optional advanced configuration:
- Cookie bypass: set x-epycbyte-set-bypass-cookie to true for in-browser testing.
  x-epycbyte-set-bypass-cookie: true
- SameSite configuration: for iframe scenarios, set the value to samesitenone.
  x-epycbyte-set-bypass-cookie: samesitenone

Example: Playwright configuration

const config: PlaywrightTestConfig = {
  use: {
    extraHTTPHeaders: {
      'x-epycbyte-protection-bypass': process.env.epycbyte_AUTOMATION_BYPASS_SECRET,
      // optional: 'true' or 'samesitenone'
      'x-epycbyte-set-bypass-cookie': 'true',
    },
  },
};

Last updated on July 24, 2024.
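Outside Playwright, the same headers can be attached to any HTTP client. A small helper sketch — the header names come from this article; the helper itself and its parameters are illustrative:

```javascript
// Build the headers that bypass deployment protection, as described above.
// `setBypassCookie` may be omitted, or be true (plain cookie for in-browser
// testing) or 'samesitenone' (for iframe scenarios).
function bypassHeaders(secret, setBypassCookie) {
  const headers = { 'x-epycbyte-protection-bypass': secret };
  if (setBypassCookie !== undefined) {
    // HTTP header values are strings, so booleans are stringified.
    headers['x-epycbyte-set-bypass-cookie'] = String(setBypassCookie);
  }
  return headers;
}
```

Usage might look like `fetch(url, { headers: bypassHeaders(process.env.epycbyte_AUTOMATION_BYPASS_SECRET, true) })` in an end-to-end test.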

Last updated on Aug 05, 2025

17. security / deployment-protection / methods-to-protect-deployments: Password Protection

Password Protection

Learn how to protect your deployments with a password.

Table of Contents - Password Protection - How it works - Security considerations - Managing Password Protection

Password Protection
Password Protection is available on Enterprise plans, or with the Advanced Deployment Protection add-on for Pro plans. Those with the owner, member, and admin roles can manage Password Protection. With Password Protection enabled, visitors to your deployment must enter the pre-defined password to gain access. You can set the desired password from your project settings when enabling the feature, and update it at any time.

How it works
A deployment protected with Password Protection presents an authentication screen that prompts visitors for the password.

Security considerations
The table below outlines key considerations and security implications when using Password Protection for your deployments on Epycbyte.

| Consideration | Description |
| --- | --- |
| Environment configuration | Can be enabled for different environments. See Understanding Deployment Protection by environment. |
| Compatibility | Compatible with the Epycbyte Authentication and Trusted IPs bypass methods. |
| Password persistence | Users only need to enter the password once per deployment; the cookie set by the feature is invalidated when the password changes. |
| Password changes | Users must re-enter the new password if you change the existing one. |
| Disabling protection | All existing deployments become unprotected if you disable the feature. |
| Token scope | JWT tokens set as cookies are valid only for the URL they were set for and can't be reused for different URLs, even if those URLs point to the same deployment. |

Managing Password Protection
You can manage Password Protection through the dashboard, API, or Terraform.

Using the Dashboard
1. Select the project that you wish to enable Password Protection for.
2. Go to Settings, then Deployment Protection.
3. From the Password Protection section:
   - Use the toggle to enable the feature.
   - Select the deployment environment you want to protect.
   - Enter a password of your choice.
   - Finally, select Save.

All your existing and future deployments for the project will be protected with a password. The next time you access a deployment, you will be asked to log in by entering the password, which takes you to the deployment. A cookie is then set in your browser for the deployment URL so you don't need to enter the password every time.
Using the API
You can manage Password Protection using the Epycbyte API endpoint that updates an existing project. The deploymentType field controls which deployments are protected:
- prod_deployment_urls_and_all_previews: Standard (production deployment URLs and all previews)
- all: all deployments
- preview: only preview deployments

To enable or update password protection, send a body such as:

{
  "passwordProtection": {
    "deploymentType": "prod_deployment_urls_and_all_previews",
    "password": "<password>"
  }
}

To disable password protection, use the following body:

{ "passwordProtection": null }

Using Terraform
You can configure Password Protection using password_protection in the epycbyte_project resource in the Epycbyte Terraform Provider.
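A sketch of building the project-update request body in code. The nested shape for enabling is an assumption inferred from the documented disable body (`{ "passwordProtection": null }`); treat the field layout as illustrative rather than the definitive API schema.

```javascript
// Build the `passwordProtection` portion of a project-update request.
// Passing null as the deployment type disables the feature, per the
// article; the enable shape below is an inferred assumption.
function passwordProtectionBody(deploymentType, password) {
  if (deploymentType === null) {
    return { passwordProtection: null }; // disable
  }
  // deploymentType: 'prod_deployment_urls_and_all_previews' | 'all' | 'preview'
  return { passwordProtection: { deploymentType, password } };
}
```

The resulting object would be JSON-serialized and sent to the project-update endpoint with your usual authentication headers.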

Last updated on Aug 05, 2025

17. security / deployment-protection: Methods to Protect Deployments

Methods to Protect Deployments

Compliance Measures
Epycbyte provides several compliance measures to ensure your deployments are secure and meet industry standards. These include:
- Shared responsibility firewall: Ensures that only authorized traffic reaches your applications.
- Access control: Restricts access based on role-based permissions.
- SAML SSO: Single sign-on for secure authentication across multiple systems.
- HTTPS/SSL: Encrypts data in transit, ensuring secure communication.

Epycbyte Authentication
Epycbyte Authentication is available on all plans and offers robust access control:
- Restrict access to non-public deployments to only users with an Epycbyte account, or share access via Shareable Links.
- Users without permission can request access, triggering notifications for branch authors.
- Enable via the Dashboard > Settings > Deployment Protection.

Password Protection
Available on Enterprise plans or with the Advanced Deployment Protection add-on for Pro plans:
- Restrict access to both non-public and public deployments based on environment type.
- Configure settings to control who can view or modify sensitive data.

Trusted IPs
Trusted IPs are available on Enterprise plans:
- Limit deployment access to specific IPv4 addresses or CIDR ranges.
- Returns a 404 for unauthorized IPs, ideal for VPNs or external proxies.

More Resources
- Understanding Deployment Protection by Environment: Learn how to protect different environments.
- Methods to Bypass Deployment Protection: Learn about controlled bypass options, such as Protection Bypass for Automation.

Last updated on August 7, 2024. This article provides a comprehensive guide to protecting your deployments using Epycbyte's tools. For more information or assistance, visit the Epycbyte documentation.

Last updated on Aug 05, 2025

17. security: Deployment Protection on Epycbyte

Deployment Protection on Epycbyte

Table of Contents 1. Compliance Measures 2. Features 3. Configuration Steps 4. Bypass Methods 5. Advanced Deployment Protection

Compliance Measures
- Shared responsibility: Ensure compliance with security standards.
- Firewall access control: Implement and maintain firewall rules.
- Regular audits: Conduct regular security audits to identify vulnerabilities.

Features

Epycbyte Authentication
- Secure access control for your deployments.
- Multi-factor authentication (MFA) support.

Password Protection
- Require a pre-defined password to access deployments.
- Update the password at any time from your project settings.

Trusted IPs
- Restrict access to known trusted IP ranges.
- Enhance security by allowing only specific IPs.

Configuration Steps

Understanding Deployment Protection by Environment

Protecting Preview and Generated URLs with Standard Protection
- Step 1: Enable Standard Protection in your project settings.
- Step 2: Configure authentication methods (e.g., MFA).
- Step 3: Set up Trusted IPs if needed.

Migrating to Standard Protection
- Step 1: Review current deployment configurations.
- Step 2: Update access control policies.
- Step 3: Enable Standard Protection and monitor.

Protecting Only Preview Deployments
- Step 1: Select "Only Preview Deployments" in project settings.
- Step 2: Configure authentication and Trusted IPs as needed.

Protecting Only Production Deployments
- Step 1: Select "Production Deployments" in project settings.
- Step 2: Use Trusted IPs to restrict access.

Protecting All Deployments
- Step 1: Enable "All Deployments" protection.
- Step 2: Configure authentication and IP rules.

Advanced Deployment Protection

Enabling Advanced Deployment Protection
- Step 1: Navigate to Project Settings > Deployment Protection.
- Step 2: Select features like Password Protection or Trusted IPs.
- Step 3: Enable Advanced Deployment Protection (available via add-on for Pro plans).
Disabling Advanced Deployment Protection - Step 1: Go to Team Settings > Billing. - Step 2: Edit the feature and follow instructions to disable. - Note: Requires 30 days of active use before disabling. Methods to Protect Deployments - Use Epycbyte's built-in security features. - Regularly review and update access controls. - Monitor logs for suspicious activities. Methods to Bypass Deployment Protection - No Bypass Support: Epycbyte does not support bypassing protection. - Use Disallowed Methods: Attempting to bypass is against Epycbyte's policies. - Contact Support: For legitimate security concerns, contact Epycbyte support. Conclusion Deployment Protection on Epycbyte ensures secure access and compliance. Configure settings wisely and stay informed about updates.

Last updated on Aug 05, 2025

17. security: Deployment Retention

Deployment Retention Deployment retention refers to the configured policies that determine how long different types of deployments are kept before they are automatically deleted. These retention policies allow you to control how long your deployment data is stored, providing enhanced protection, compliance support, and efficient storage management. Epycbyte provides unlimited deployment retention for all deployments, regardless of the plan you are using. You can configure retention durations for the following deployment states: - Canceled deployments - Errored deployments - Preview deployments - Production deployments For example, if you set a 60-day retention period for a production deployment created on 01/01/2024 and later replace it with a newer deployment, the original deployment will expire on 03/01/2024. Users accessing it will see a 410 status code until 03/31/2024, when all associated resources are permanently removed. Once a retention policy is enabled, deployments outside the retention period will be automatically marked for deletion within a few days of enabling the policy. Setting a Deployment Retention Policy To configure a retention policy: 1. Navigate to the Settings tab of your project. 2. Select Security on the side panel. 3. Scroll down to the Deployment Retention Policy section. 4. Select the dropdown menu with the appropriate duration. 5. Save the new retention policy for your project. Viewing Deployment Retention Policy You can view your deployment retention policy using the Epycbyte CLI with the following command: terminal epycbyte list [project-name] --policy errored=6m Restoring a Deleted Deployment If a deployment is marked for deletion accidentally or as part of the retention policy, it can be restored within a 30-day recovery period. To restore a deleted deployment: 1. Navigate to the Settings tab of your project. 2. Select Security on the side panel. 3. Scroll down to the Recently Deleted section. 4. 
Find the deployment that needs to be restored and click on the dropdown menu item Restore. 5. Complete the confirmation in the modal. Note: For advanced deployment restoration support, contact us to discuss Enterprise options. Contact Sales for more details. Summary 1. Deployment Retention is available on all plans. 2. Deployment retention refers to configured policies that determine how long deployments are kept before automatic deletion. 3. These policies provide: - Enhanced protection - Compliance support - Efficient storage management 4. Epycbyte provides unlimited deployment retention for all deployments. How-to - Learn how deployment retention policies affect a deployment's lifecycle. - Understand the steps to set, view, and restore deleted deployments. Troubleshooting - If you encounter issues with deployment retention, refer to the troubleshooting guide. Last updated on October 11, 2024.
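The dates in the 60-day example above are plain calendar arithmetic. The sketch below (illustrative Python, not an Epycbyte API) reproduces them; the 30-day gap between expiry and permanent removal is taken from the same example:

```python
from datetime import date, timedelta

def retention_expiry(created_on: date, retention_days: int) -> date:
    # Day a replaced deployment expires and begins returning a 410 status.
    return created_on + timedelta(days=retention_days)

def removal_date(expired_on: date, grace_days: int = 30) -> date:
    # Day the expired deployment's resources are permanently removed.
    return expired_on + timedelta(days=grace_days)
```

For the example deployment created on 01/01/2024 with a 60-day policy, `retention_expiry` yields 03/01/2024 and `removal_date` yields 03/31/2024, matching the dates above.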

Last updated on Aug 05, 2025

17. security: Directory Sync

Directory Sync Directory Sync is a feature available on Epycbyte's Enterprise plans, designed to help teams manage organization membership from third-party identity providers like Google Directory or Okta. Similar to SAML Single Sign-On, Directory Sync is only accessible to Team Owners and requires configuration by the team owner. Compliance Measures - Shared Responsibility: Ensure that at least one team member retains the owner role to avoid account lockout. - Firewall Access Control: Implement appropriate firewall rules to secure access to sensitive data. - SAML SSO/HTTPS/SSL: Enable SAML Single Sign-On and ensure HTTPS/SSL for secure communication. Directory Sync Overview Directory Sync automatically synchronizes changes from your directory provider (e.g., Okta) with your Epycbyte Team. This includes: - Adding new users: Automatically sends invitations to join the team. - Removing users: Automatically revokes access from the team. - Role mapping: Configures roles based on groups from your directory provider (e.g., Engineers as Members, Admins as Owners). Configuration Steps 1. Scope Selection: Ensure your team is selected in the scope selector. 2. Access Settings: Navigate to the Settings tab under Security & Privacy. 3. SAML Configuration: Select Configure under SAML Single Sign-On to start the setup process. 4. Role Mapping: Map directory groups to Epycbyte roles (e.g., Okta Admins as Owners, Engineers as Members). 5. Confirmation: Review the changes and click Confirm and Sync to finalize the configuration. Preventing Account Lockout - Maintain at least one owner role within your team. 
- Use predefined group names for automatic role allocation: - epycbyte-role-owner: Owner - epycbyte-role-member: Member - epycbyte-role-developer: Developer - epycbyte-role-billing: Billing - epycbyte-role-viewer: Viewer - epycbyte-role-contributor: Contributor Supported Providers - Okta - Google Directory - Other supported SAML providers (refer to SAML Single Sign-On documentation) Conclusion Configuring Directory Sync ensures seamless team management while maintaining security and compliance. For assistance, contact Epycbyte's sales team.
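A sync job that allocates roles from the predefined group names above could be sketched like this (illustrative Python; the fallback default of Member is an assumption, not documented behavior):

```python
# Maps the predefined directory group names listed above to Epycbyte roles.
GROUP_ROLE_MAP = {
    "epycbyte-role-owner": "Owner",
    "epycbyte-role-member": "Member",
    "epycbyte-role-developer": "Developer",
    "epycbyte-role-billing": "Billing",
    "epycbyte-role-viewer": "Viewer",
    "epycbyte-role-contributor": "Contributor",
}

def role_for_groups(groups: list[str], default: str = "Member") -> str:
    # Return the role for the first recognized group, else a default role.
    for group in groups:
        if group in GROUP_ROLE_MAP:
            return GROUP_ROLE_MAP[group]
    return default
```

Keeping at least one user in `epycbyte-role-owner` is what prevents the account-lockout scenario described above.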

Last updated on Aug 05, 2025

17. security: encryption

HTTPS/SSL Overview Out of the box, every Deployment on Epycbyte is served over an HTTPS connection. This ensures that web content is always served over a secure connection, which helps protect users' data and privacy. Table of Contents 1. Supported TLS Versions 2. TLS Resumption 3. OCSP Stapling 4. Supported Ciphers 5. Support for HSTS 6. How Certificates Are Handled 1. Supported TLS Versions Epycbyte supports TLS version 1.2 and TLS version 1.3. 2. TLS Resumption Epycbyte supports both Session Identifiers and Session Tickets as methods for resuming a TLS connection. This can significantly improve Time To First Byte (TTFB) for returning visitors. 3. OCSP Stapling To ensure clients can validate TLS certificates as quickly as possible, we staple an OCSP response, allowing them to skip querying the certificate authority's OCSP responder for known-good certificates. This improves performance and reduces latency. 4. Supported Ciphers The following cipher suites are supported: - AES-GCM - ChaCha20-Poly1305 (X25519 is used for key exchange) 5. Support for HSTS Epycbyte supports HTTP Strict Transport Security (HSTS) by default, which ensures that browsers will always use HTTPS for the given domain, even when a page is requested over plain HTTP. 6. How Certificates Are Handled - Pre-generated certificates: When custom certificates are generated using epycbyte certs issue, their keys are placed in our database and encrypted at rest within the Network layer. - Certificate rotation: Both the certificate and key are cached in memory for optimal SSL termination performance. Full Specification For detailed information on encryption mechanisms, refer to SSL Labs. You only need to select any IP address of your choice (it does not matter which one you pick – the results are the same for all). Last updated on July 17, 2024.
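If you want to confirm which TLS version and cipher suite a deployment actually negotiates, Python's standard-library ssl module can report it. This is a client-side sketch, not an Epycbyte tool; the hostname is whatever deployment URL you choose, and the context enforces the TLS 1.2 minimum described above:

```python
import socket
import ssl

def strict_tls_context() -> ssl.SSLContext:
    # Client context matching the server-side policy above: TLS 1.2 minimum,
    # with certificate verification left at the secure defaults.
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx

def negotiated_tls(hostname: str, port: int = 443) -> tuple[str, str]:
    # Connect and report the negotiated TLS version and cipher suite name.
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with strict_tls_context().wrap_socket(sock, server_hostname=hostname) as tls:
            return tls.version(), tls.cipher()[0]
```

Calling `negotiated_tls("your-deployment.example")` against a live deployment should report TLS 1.2 or 1.3 with one of the suites listed above.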

Last updated on Aug 05, 2025

17. security: Epycbyte Firewall

Epycbyte Firewall The Epycbyte Firewall is a comprehensive security solution designed to protect your applications and websites from DDoS attacks and unauthorized access. It provides a robust set of tools and infrastructure tailored for enhanced security. Table of Contents 1. Compliance Measures 2. DDoS Mitigation 3. Attack Challenge Mode 4. Web Application Firewall (WAF) 5. TLS Fingerprints 6. Access Control Compliance Measures Epycbyte Firewall adheres to shared responsibility compliance measures, ensuring that security is a collaborative effort between Epycbyte and its customers. Key measures include: - DDoS Mitigation: Automated DDoS mitigation is available on all plans. - Attack Challenge Mode: Allows customers to customize rules for additional control during high traffic. - Web Application Firewall (WAF): Available on all plans, enabling IP blocking and custom rules. - TLS Fingerprints: Advanced security features using JA3/JA4 TLS fingerprints. - Access Control: Secure backend access with directory sync and protected git scopes. DDoS Mitigation Epycbyte Firewall provides robust DDoS mitigation across all plans. Key features include: - Automated Prevention: The system automatically blocks abnormal or suspicious traffic. - Dedicated Support for Enterprises: Available on Enterprise plans, offering tailored support for significant attacks. - Real-time Alerts: Notifications via webhooks or Slack to address potential threats. Attack Challenge Mode Attack Challenge Mode is designed to give customers greater control during high traffic situations. It works alongside Epycbyte's automated DDoS mitigation to ensure legitimate traffic only reaches your applications. Web Application Firewall (WAF) Epycbyte WAF allows you to customize the firewall by blocking specific IP addresses or applying rules through the dashboard. This adds an extra layer of security beyond basic DDoS mitigation. 
TLS Fingerprints For advanced security needs, Epycbyte Firewall offers TLS fingerprints using JA3/JA4 technology. These unique session identifiers help detect persistent threats like botnets and APTs. Access Control The firewall provides secure backend access with directory sync and protected git scopes, ensuring that only authorized users can access sensitive information. DDoS Attack Alerts Epycbyte Firewall offers multiple ways to stay informed about potential attacks: - Webhooks: Set up a webhook to receive notifications when DDoS attacks are detected. - Slack Integration: Enable the Slack app for real-time alerts on specific projects or teams. To configure these, follow the provided guides and use commands like /epycbyte subscribe {team_id} firewall.attack. Conclusion Epycbyte Firewall is a powerful tool for protecting your applications and websites from DDoS attacks and unauthorized access. Its combination of automated mitigation, customizable rules, and advanced features ensures a strong security posture.
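When receiving webhook notifications such as the attack alerts above, it is standard practice to authenticate each delivery before acting on it. The signature scheme below (HMAC-SHA256 over the raw request body) is a common webhook pattern and an assumption here, not documented Epycbyte behavior; check the webhook guide for the actual header name and scheme:

```python
import hashlib
import hmac

def verify_webhook_signature(body: bytes, secret: bytes, signature_hex: str) -> bool:
    # Recompute HMAC-SHA256 over the raw body and compare in constant time,
    # so that forged or tampered deliveries are rejected.
    expected = hmac.new(secret, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature_hex)
```

Always compare with `hmac.compare_digest` rather than `==` to avoid timing side channels.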

Last updated on Aug 05, 2025

17. security / epycbyte-waf: ip blocking

WAF IP Blocking Guide Overview Epycbyte Web Application Firewall (WAF) provides robust security features to protect your applications and websites from malicious traffic. One of the key features is IP Blocking, which allows you to restrict access to your resources based on specific IP addresses or CIDR ranges. Compliance Measures - Shared Responsibility: Ensure that your team understands the compliance requirements for IP blocking. - DDoS Mitigation: Use IP blocking as a layer in your DDoS defense strategy. - Attack Challenge Mode: Configure WAF to challenge requests from unknown IPs. - Web Application Firewall (WAF): Implement custom rules to block malicious IPs. Common Use Cases - Blocking Malicious IPs: Prevent known attack sources from accessing your applications. - Competitor Blocking: Restrict access to your content from competitors or scrapers. - Compliance: Block IPs based on legal and regulatory requirements. - Geographic Restrictions: Limit access from specific regions using IP ranges. Access Roles - Viewer Role: Allows viewing the Firewall overview page and listing rules. - Developer Role: Requires additional permissions to configure and apply rules. - Administrator Role: Full access to configure, save, and apply IP blocking rules. Project Level IP Blocking - Available on all Epycbyte plans. - Configure by: 1. Navigating to the Firewall tab in your project settings. 2. Selecting "Configure" on the top right of the Firewall overview page. 3. Accessing the IP Blocking section and adding desired IPs or CIDRs. Account-Level IP Blocking - Available only on Enterprise plans. - Configure by: 1. Navigating to your dashboard's Security tab. 2. Selecting "Create New Rule" under the IP Blocking section. 3. Adding blocked IPs and corresponding domains. How-to Guide: Adding an IP Block Rule 1. Project Level: - Open Firewall settings. - Add IP addresses and domains in the Configure New Domain Protection modal. - Select "Create IP Block Rule" to save changes. 2. 
Account Level (Enterprise plans): - Use dashboard settings to create rules at the account level. Geolocation Blocking - Use custom rules for blocking traffic from specific regions. - Contact Epycbyte support for detailed configuration steps.
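The matching logic behind an IP block rule — does a client address fall inside any blocked address or CIDR range — can be sketched with Python's standard-library ipaddress module (illustrative only; the firewall evaluates rules server-side):

```python
import ipaddress

def is_blocked(client_ip: str, rules: list[str]) -> bool:
    # True when the client IP matches any blocked address or CIDR range.
    # strict=False lets a bare address like "192.0.2.50" act as a /32 rule.
    addr = ipaddress.ip_address(client_ip)
    return any(addr in ipaddress.ip_network(rule, strict=False) for rule in rules)
```

For example, a rule list of `["203.0.113.0/24"]` blocks `203.0.113.7` but not `198.51.100.1`.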

Last updated on Aug 05, 2025


17. security: protected git scopes

Protected Git Scopes Protected Git Scopes are available on Enterprise plans. Those with the owner role can access this feature. How-to Learn how to limit other Epycbyte teams from deploying from your Git repositories. Table of Contents - Protected Git Scopes - Managing Protected Git Scopes - Adding a Protected Git Scope - Removing a Protected Git Scope Protected Git Scopes Only allow specific Epycbyte teams to deploy your Git repositories. As an owner of multiple teams, you can claim the same scope for each Epycbyte team, allowing all of them to deploy repositories from your protected Git scope. Managing Protected Git Scopes You can add up to five Protected Git Scopes to your Epycbyte Team. Multiple teams can specify the same scope, allowing all of those teams access. Adding a Protected Git Scope 1. Go to your Team Security Settings: from your Epycbyte Team's dashboard, go to Settings, then Security and Privacy 2. Scroll down to Protected Git Scopes 3. Select Add a Protected Git Scope 4. In the modal, select the Git provider you wish to add 5. In the modal, select the Git namespace you wish to add 6. Click Save Removing a Protected Git Scope 1. Go to your Team Security Settings: from your Epycbyte Team's dashboard, go to Settings, then Security and Privacy 2. Scroll down to Protected Git Scopes 3. Select Remove to remove the Protected Git Scope

Last updated on Aug 05, 2025

17. security: SAML Single Sign On

SAML Single Sign-On Overview SAML (Security Assertion Markup Language) Single Sign-On (SSO) is a feature available on Epycbyte's Enterprise plans. Team owners can configure this feature to enable secure, centralized authentication for their team members using third-party identity providers like Okta or Auth0. Configuring SAML SSO Prerequisites - Must be an owner of the team. - Ensure the team is selected in the scope selector on your dashboard. Steps 1. Navigate to Settings > Security & Privacy. 2. Select SAML Single Sign-On. 3. Click Configure and follow the walkthrough to set up SAML SSO with your preferred identity provider. Enforcing SAML SSO - To enhance security, enforce SAML SSO so that team members cannot access any team information unless authenticated via SAML. - You must be an owner and authenticated with SAML SSO before enabling this feature. Steps 1. From the dashboard, go to Settings > Security & Privacy. 2. Navigate to SAML Single Sign-On. 3. Toggle Require Team Members to login with SAML to Enabled. Authenticating with SAML SSO - After configuring SAML, team members can log in using their identity provider. Steps 1. On the authentication page, select Continue with SAML SSO. 2. Enter your team's URL. The team slug (e.g., acme for epycbyte.com/acme) is used here. 3. Select Continue with SAML SSO again to redirect to the third-party provider. Customizing the Login Page - Create a login page that only shows the SAML SSO option by appending your team ID as a query parameter: https://epycbyte.com/login?saml=team_id. De-provisioning Team Members - Epycbyte supports SCIM (System for Cross-domain Identity Management), so removing a user from your SAML provider automatically offboards them from Epycbyte. 
Supported SAML Providers - Okta - Auth0 - Google - Azure - Microsoft ADFS - PingOne - OneLogin - Duo - JumpCloud - PingFederate - ADP - Keycloak - Cyberark - OpenID - VMware - LastPass - miniOrange - NetIQ - Oracle Cloud - Salesforce CAS - ClassLink - Cloudflare - SimpleSAMLphp Conclusion This article provides a comprehensive guide on configuring and managing SAML Single Sign-On for your team on Epycbyte. Let us know if you found this helpful!
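Building the SAML-only login URL described above is a one-liner, shown here as an illustrative Python sketch (the `team_id` value is a placeholder for your team's actual ID):

```python
from urllib.parse import urlencode

def saml_login_url(team_id: str) -> str:
    # Login page that shows only the SAML SSO option, per the section above.
    return "https://epycbyte.com/login?" + urlencode({"saml": team_id})
```

Using `urlencode` keeps the URL valid even if the team ID contains characters that need percent-encoding.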

Last updated on Aug 05, 2025

17. security / secure-backend-access / oidc: reference

To validate an OIDC token and integrate it with AWS IAM for role assumption, follow these steps: 1. Retrieve JWK: - Fetch the JSON Web Key (JWK) from jwks_uri using the kid from the token's header. This can be done by making a GET request to the URL retrieved from the OpenID Connect configuration. 2. Verify JWT Signature: - Use the retrieved JWK to verify the signature of the JWT. This ensures the token is legitimate and secure. 3. Extract Claims: - Decode the JWT to extract the payload, which includes claims such as iss, aud, sub, exp, etc. 4. Validate Claims: - Ensure the audience (aud) matches https://epycbyte.com/[TEAM_SLUG]. - Confirm the subject (sub) aligns with owner:[TEAM_SLUG]:project:[PROJECT_NAME]:environment:[ENVIRONMENT]. 5. Check Expiration: - Convert the exp claim from seconds since epoch to a human-readable format to determine the token's validity period. 6. Integrate with AWS IAM: - Structure an IAM policy document with conditions that check the validated claims. - Use these policies in AWS IAM roles to control access based on the token's attributes. 7. Handle Errors and Expiration: - Implement mechanisms to handle invalid or expired tokens, such as returning appropriate HTTP errors or denying access if necessary. 8. Test and Monitor: - Test the setup with actual tokens to ensure correct behavior. - Continuously monitor for issues like token expiration or configuration changes that might affect access control.

Last updated on Aug 05, 2025

17. security / secure-backend-access: OpenID Connect (OIDC) Federation

OpenID Connect (OIDC) Federation Secure backend access with OIDC federation is available on all plans. When you create long-lived, persistent credentials in your backend to allow access from your web applications, you increase the security risk of these credentials being leaked and hacked. You can mitigate this risk with OpenID Connect (OIDC) federation which issues short-lived, non-persistent tokens that are signed by Epycbyte's OIDC Identity Provider (IdP). Cloud providers such as Amazon Web Services, Google Cloud Platform, and Microsoft Azure can trust these tokens and exchange them for short-lived credentials. This way, you can avoid storing long-lived credentials as Epycbyte environment variables. Benefits - No persisted credentials: There is no need to copy and paste long-lived access tokens from your cloud provider into your Epycbyte environment variables. Instead, you can exchange the OIDC token for short-lived access tokens with your trusted cloud provider. - Granular access control: You can configure your cloud providers to grant different permissions depending on project or environment. For instance, you can separate your development, preview, and production environments on your cloud provider and only grant Epycbyte issued OIDC tokens access to the necessary environment(s). - Local development access: You can configure your cloud provider to trust local development environments so that long-lived credentials do not need to be stored locally. Getting Started 1. In order to allow your deployment to connect with your backend securely, start by enabling OIDC federation for your Epycbyte project: - Open your project from the Epycbyte dashboard - Select the Settings tab - From the Security section, enable the "Secure backend access with OIDC federation" toggle 2. 
Configure your backend to trust Epycbyte's OIDC Identity Provider and connect to it from your Epycbyte deployment: - Connect to Amazon Web Services (AWS) - Connect to Google Cloud Platform (GCP) - Connect to Microsoft Azure - Connect to your own API Issuer Mode There are two options available for configuring the token's issuer URL (iss): - Team (Recommended): The issuer URL is bespoke to your team, e.g., https://oidc.epycbyte.com/acme. - Global: The issuer URL is generic, e.g., https://oidc.epycbyte.com. How OIDC Token Federation Works 1. In Builds: - When you run a build, Epycbyte automatically generates a new token and assigns it to the epycbyte_OIDC_TOKEN environment variable. - You can then exchange the token for short-lived access tokens with your cloud provider. 2. In Epycbyte Functions: - The OIDC token is automatically passed to your functions when they are invoked. - Functions can use this token to authenticate and gain access to your backend systems. 3. Local Development: - Use the epycbyte-cli tool to generate and inject the OIDC token during local development. - This allows you to test your application without needing to manage tokens manually. Related Helper Libraries - OIDC libraries: Use existing libraries like python-oidc or js-oidc to handle token exchange and management. - API wrappers: Create API wrappers that use the OIDC token for authentication. By leveraging OIDC federation, you can securely connect your applications to backend systems while minimizing the risk of credential exposure.
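To make the token exchange concrete for AWS: `AssumeRoleWithWebIdentity` is the real STS action that trades an OIDC token for short-lived credentials. The sketch below only builds the Query API parameters (version 2011-06-15); the role ARN and session name are placeholders, and in practice you would call this through an AWS SDK rather than constructing the request by hand:

```python
from urllib.parse import urlencode

def sts_assume_role_query(role_arn: str, session_name: str, oidc_token: str) -> str:
    # Query parameters for AWS STS AssumeRoleWithWebIdentity, which exchanges
    # the deployment's OIDC token for short-lived AWS credentials.
    return urlencode({
        "Action": "AssumeRoleWithWebIdentity",
        "Version": "2011-06-15",
        "RoleArn": role_arn,
        "RoleSessionName": session_name,
        "WebIdentityToken": oidc_token,
    })
```

The IAM role's trust policy is where you pin the token's issuer, `aud`, and `sub` claims so that only the intended project and environment can assume it.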

Last updated on Aug 05, 2025

17. security: secure compute

Epycbyte Secure Compute Compliance Measures Shared Responsibility Firewall Access Control SAML SSO HTTPS/SSL Directory Sync Secure Backend Access Secure Compute OpenID Connect Federation New Deployment Protection Deployment Retention Audit Logs Protected Git Scopes Security Secure Backend Access Secure Compute Epycbyte Secure Compute allows you to establish secure connections between Epycbyte and backend environments. Table of Contents - Secure Compute - How Secure Compute works - Enabling Secure Compute - Secure Compute networks and dedicated IP addresses - Specific region - Region failover - Add a project to your Secure Compute network - Managing the build container - Multiple Secure Compute networks - VPC peering - VPN Support - Limits Secure Compute Secure Compute is available for purchase on Enterprise plans. With Secure Compute, you can create private connections between Epycbyte Functions and your backend cloud, like databases or other private infrastructure. Currently, Epycbyte deployments require you to allow all IP addresses on your backend cloud. For security reasons, publicly exposing your backend cloud, even if it is behind a firewall, may not be sufficient to meet the requirements of your organization's security and compliance obligations. How Secure Compute works Secure Compute establishes secure connections between Epycbyte Functions and your backend cloud by creating a private network with dedicated IP addresses. This allows you to control access to your backend cloud and ensure that only authorized traffic is allowed. Enabling Secure Compute To enable Secure Compute, contact Epycbyte and supply your desired region, and optionally CIDR block. The CIDR blocks of Secure Compute network and your VPC must not overlap. Secure Compute networks and dedicated IP addresses Secure Compute creates a private network with dedicated IP addresses for each project. 
This allows you to control access to your backend cloud and ensure that only authorized traffic is allowed. Specific region When you use Secure Compute, Epycbyte accepts a VPC peering connection between your Epycbyte Secure Compute network and your AWS VPC in the same or different region. Region failover If your Epycbyte Functions are deployed in multiple regions, you can use multiple Secure Compute networks to have different IP pairs in each region. In this case, you can allocate different IP addresses to test projects, internal tools, and public-facing platforms for improved manageability and security. Add a project to your Secure Compute network To add a project to your Secure Compute network, select the private network from the list, then click the "Add Project" button. Enter the project name and description, then click "Save". Managing the build container When connected to a Secure Compute network, builds experience up to a 5s delay as they provision a secure build container. When this happens, your build is marked as "Provisioning Container" in the dashboard. Multiple Secure Compute networks You can use one network with multiple projects in the same team. In this case, the same IP pair is shared across multiple projects. If you require additional security or have a large team, you can have one network for each project so that each project will have its own dedicated IP pair. VPC peering VPC peering is a method of connecting two VPCs in the same or different region. When you use Secure Compute, Epycbyte accepts a VPC peering connection between your Epycbyte Secure Compute network and your AWS VPC. To set up VPC peering: 1. Request Secure Compute: Contact Epycbyte and supply your desired region, and optionally CIDR block. 2. 
Set up peering in AWS: In your AWS VPC dashboard, configure the peering connection by copying the values from your Secure Compute network settings, and pasting in the AWS VPC peering connection settings: - Requester VPC ID: Your VPC ID - Account ID: The AWS account ID - Accepter VPC ID: Your Epycbyte Secure Compute network's VPC Peering ID - Region: Your Epycbyte Secure Compute network's region 3. Create peering connection: In the AWS VPC peering connection settings, click "Create Peering Connection" to establish the connection. 4. Accept peering connection: Go back to your Epycbyte dashboard and click "Accept" to accept the connection. VPN Support If your current security and compliance obligations require more than dedicated IP addresses, contact us for guidance related to your specific needs. Note: If you require support for VPN connections, Contact Sales. Limits - Build delay: When connected to a Secure Compute network, builds experience up to a 5s delay as they provision a secure build container. - Max number of VPC peering connections: The maximum number of VPC peering connections that can be established per network is 50.
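The requirement above that the Secure Compute network's CIDR block and your VPC's CIDR block must not overlap is easy to check up front with Python's standard-library ipaddress module (an illustrative pre-flight check, not part of the Epycbyte setup flow):

```python
import ipaddress

def cidrs_overlap(secure_compute_cidr: str, vpc_cidr: str) -> bool:
    # True when the two CIDR blocks share any addresses; the peering setup
    # above requires this to be False.
    return ipaddress.ip_network(secure_compute_cidr).overlaps(
        ipaddress.ip_network(vpc_cidr))
```

For example, `10.0.0.0/16` overlaps `10.0.1.0/24` (the latter is contained in the former), so one of the two ranges would need to change before requesting the network.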

Last updated on Aug 05, 2025

17. security: security

Compliance Epycbyte is committed to ensuring the security and integrity of customer data. We have implemented various measures to ensure compliance with relevant regulations. - SOC 2 Type 2: Epycbyte has undergone a SOC 2 Type 2 audit, which provides assurance that our internal controls are operating effectively. - ISO 27001:2013: We have implemented an Information Security Management System (ISMS) based on the ISO 27001:2013 standard, ensuring a robust and comprehensive approach to information security. - GDPR: Epycbyte is committed to complying with the General Data Protection Regulation (GDPR), ensuring that customer data is processed in accordance with EU regulations. - PCI DSS: We have implemented measures to ensure compliance with the Payment Card Industry Data Security Standard (PCI DSS), protecting sensitive payment information. - HIPAA: Epycbyte has implemented measures to ensure compliance with the Health Insurance Portability and Accountability Act (HIPAA), protecting sensitive health information. Infrastructure Epycbyte's infrastructure is designed to provide a secure and reliable environment for customer data. - Cloud Provider: We use Amazon Web Services (AWS) as our cloud provider, which provides a highly secure and scalable infrastructure. - Data Centers: Epycbyte has multiple data centers located in different regions, ensuring that customer data is replicated and available in case of regional failures. - Network Security: Our network security measures include firewalls, intrusion detection systems, and encryption to protect against unauthorized access. Security Measures Epycbyte has implemented various security measures to ensure the confidentiality, integrity, and availability of customer data. - Encryption: We use 256-bit Advanced Encryption Standard (AES-256) to encrypt data at rest and HTTPS/TLS 1.3 for data in transit. 
- Access Control: Epycbyte implements role-based access control, ensuring that only authorized personnel have access to sensitive data. - Backup and Recovery: We perform regular backups of customer data, which are stored separately in a storage service. Compliance with EU-U.S. Data Privacy Framework Epycbyte is certified under the EU-U.S. Data Privacy Framework, ensuring that we meet the necessary standards for transferring personal data from the European Union (EU), United Kingdom (UK), and Switzerland to the United States (U.S.). Enterprise Accounts Enterprise Teams on Epycbyte have their own build infrastructure, ensuring isolation from Hobby/Pro accounts. Penetration Testing and Audit Scans Epycbyte conducts regular penetration testing through third-party penetration testers and has daily code reviews and static analysis checks. Epycbyte manages your data with a focus on security and availability. Here's an organized summary of where your data resides and how it's protected: 1. Infrastructure and Regions: - Epycbyte uses AWS across 18 regions, along with an Anycast network for global traffic distribution. - The default location for serverless functions is the U.S., but you can choose other regions to optimize performance. 2. Data Storage and Transfer: - Data is stored in various locations where Epycbyte or its service providers operate, including the U.S. and globally replicated regions. - Epycbyte uses a shared responsibility model: it handles infrastructure while you manage data configuration and storage. 3. Encryption: - Data at rest is encrypted with AES-256. - Data in transit uses HTTPS/TLS 1.3 for secure communication. 4. Backups: - Epycbyte performs hourly backups stored for 30 days, globally replicated for disaster resilience. - Backups are not accessible to customers and are used only by Epycbyte for recovery purposes. 5. Security Practices: - Regular penetration testing and audits are conducted to ensure security standards. 
- Enterprise accounts have isolated infrastructure from Hobby/Pro accounts. 6. Uptime and Resiliency: - Failover strategies, including AWS Global Accelerator and Anycast, reroute traffic during regional outages. - Multi-region redundancy and resiliency testing ensure minimal disruption to services. 7. Customer Responsibilities: - Customers are responsible for data configuration and backups, as Epycbyte's backups are internal for recovery use only.

Last updated on Aug 05, 2025

17. security: shared responsibility

Shared Responsibility Model

A shared responsibility model is a framework designed to split tasks and obligations between two groups in cloud computing. The model divides duties to ensure security, maintenance, and service functionality.

Customer Responsibilities

- Security requirements assessment: Customers are responsible for evaluating whether Epycbyte's platform and the security protection it provides meet the specific needs and requirements of their application.
- Handling malicious traffic: Customers are responsible for any costs and resource consumption related to malicious traffic.
- Payment transactions: Customers subject to PCI DSS compliance are responsible for choosing an appropriate payment gateway provider to integrate an iframe into their application.
- Client-side data: Customers are responsible for the security and management of data on their clients' devices.
- Source code: Customers are responsible for securely storing and maintaining their source code at all times.
- Server-side encryption: Customers are responsible for encrypting their server-side data, whether it's stored in the file system or in a database.
- Identity & Access Management (IAM): Customers choose and implement their desired level of access control in their IAM configuration with the tools provided by Epycbyte.
- Region selection for compute: Customers are responsible for selecting the appropriate regions for their compute resources based on their requirements and compliance needs.
- Production checklist: Customers are responsible for implementing and adhering to the recommended best practices in Epycbyte's production checklist.
- Spend management: Customers are responsible for enabling Spend Management to set a reasonable spend amount and configure actions based on that amount as needed.

Shared Responsibilities

- Information and data: Customers control and own their data. By design, customers determine the access to their data and are responsible for securing and protecting it while in their possession.
- Integrations: Customers are responsible for deciding which Epycbyte services to use and the data that is collected or needed to provide those services.

Epycbyte Responsibilities

- Infrastructure: Epycbyte is responsible for the security and availability of the underlying infrastructure used to provide our services.
- Encryption & data integrity: Epycbyte is responsible for encryption and data integrity for data in transit (when in motion between systems or locations) and at rest for the services Epycbyte controls.

Last updated on Aug 05, 2025

18. teams-and-accounts: Create a Team

Table of Contents 1. Create a Team 2. Team Membership 3. Suggested Teams 4. Leaving a Team 5. Deleting a Team 6. Default Team 7. How to Change Your Default Team 8. Find Your Team ID 9. Team Email Domain

Create a Team

Teams on Epycbyte allow collaboration with members and access to additional resources. Follow these steps to create a team:
1. Click the scope selector at the top left of the navigation bar.
2. Select "Create New Team."
3. Name your team.
4. Choose a team plan based on your existing plans.

Team Membership

Joining a team can happen through:
- Invitation from a team owner
- Automatic addition via an identity provider
- Requesting access by pushing a commit to a private Git repo or interacting with suggested teams

Suggested Teams

Epycbyte suggests teams based on your email domain or your GitHub, GitLab, or Bitbucket memberships. These appear in the scope selector and team settings.

Leaving a Team

To leave a team:
1. If you're not the last member, go to the Settings tab.
2. Select "Leave Team."
3. Confirm your action.

If you're the last member, either:
- Assign another confirmed member as owner before leaving, or
- Delete the team instead of leaving.

Deleting a Team

To delete a team:
1. Remove all domains.
2. Go to Settings and select "Delete Team."
3. Confirm deletion.

Alternatively, downgrade to the Hobby plan instead of deleting.

Default Team

The default team is used in API/CLI requests and is shown on login. The first Hobby or Pro team created becomes the default.

Changing Your Default Team

1. Navigate to epycbyte.com/account/settings.
2. Select a new default team from the dropdown.
3. Save your selection.

Find Your Team ID

Your Team ID is unique and assigned upon creation. Locate it:
- Using the Epycbyte API
- Via URL: https://epycbyte.com/teams/[your_team_name]/settings#team-id
- In the Settings tab under General > Team ID

Team Email Domain

Add an email domain to allow users with matching addresses to request access without invitations.

Setting the Team Email Domain

1. Add the domain in Epycbyte Domains.
2. Example: acme.com allows john@acme.com to request access.

This article provides a comprehensive guide to team management on Epycbyte. Let us know if you have further questions!
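The email-domain matching described above can be sketched as a small predicate. This is a hypothetical helper for illustration only, not an Epycbyte API:

```javascript
// Hypothetical sketch: decide whether a user may request team access
// based on the team's configured email domain (e.g. "acme.com").
function canRequestAccess(email, teamDomain) {
  const at = email.lastIndexOf('@');
  if (at === -1) return false; // not a valid email address
  // Compare the part after "@" case-insensitively against the team domain.
  return email.slice(at + 1).toLowerCase() === teamDomain.toLowerCase();
}

console.log(canRequestAccess('john@acme.com', 'acme.com'));  // true
console.log(canRequestAccess('jane@other.com', 'acme.com')); // false
```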

Last updated on Aug 05, 2025

19. production-checklist: Production checklist for launch

Production Checklist for Launch

Ensure your application is ready for launch with this comprehensive production checklist, prepared by the Epycbyte engineering team. It covers operational excellence, security, reliability, performance efficiency, and cost optimization.

Table of Contents - Operational Excellence - Security - Reliability - Performance - Cost Optimization

When launching your application on Epycbyte, it is important to ensure that it's ready for production. This checklist was prepared by the Epycbyte engineering team and is designed to help you prepare your application for launch by running through a series of questions covering operational excellence, security, reliability, performance efficiency, and cost optimization.

Operational Excellence
- Define an incident response plan for your team, including escalation paths, communication channels, and rollback strategies for deployments.
- Enable Monitoring to debug and optimize performance, investigate errors and traffic, and more.
- Familiarize yourself with the ability to promote and roll back deployments.
- Ensure caching is configured if deploying from a monorepo to prevent unnecessary builds.
- Perform a zero-downtime migration to Epycbyte DNS.
- Add a www subdomain and redirect your apex domain.
- Consider using v0 to quickly generate and iterate on React and Tailwind CSS components.
- Configure enhanced build hardware with larger memory and storage (Enterprise plans only).
- Enable Conformance to run automated checks on your code for product-critical issues such as performance, security, and code health (Enterprise plans only).

Security
- Implement a Content Security Policy (CSP) and proper security headers.
- Enable Deployment Protection to prevent unauthorized access to your deployments.
- Configure the Epycbyte Web Application Firewall (WAF) to monitor, block, and challenge incoming traffic. This includes setting up custom rules, IP blocking, and enabling managed rulesets for enhanced security.
- Enable Log Drains to persist logs from your deployments.
- Review common SSL certificate issues.
- Enable a Preview Deployment Suffix to use a custom domain for Preview Deployments.
- Commit your lockfiles to pin dependencies and speed up builds through caching.
- Consider implementing rate limiting to prevent abuse.
- Review and implement access roles to ensure the correct permissions are set for your team members.
- Enable SAML SSO and SCIM (Enterprise plans with Owner role only).
- Enable Audit Logs to track and analyze team member activity (Enterprise plans with Owner role only).
- Ensure that cookies comply with the allowed cookie policy to enhance security (Enterprise plans with Owner role only).
- Set up a firewall rule to block requests from unwanted bots to your project deployment.

Reliability
- Enable Monitoring to debug and optimize performance, investigate errors and traffic, and more.
- Enable automatic Function failover to add multi-region redundancy and protect against regional outages.
- Implement caching headers for static assets or Function responses to reduce usage and origin requests.
- Understand the differences between caching headers and Incremental Static Regeneration.
- Consider using OpenTelemetry to instrument your application for distributed tracing.
- Consider running a load test on your application to ensure reliability.

Performance
- Enable Speed Insights to analyze performance metrics.
- Review Time To First Byte (TTFB) to optimize page load times.
- Optimize images, videos, and other media assets for faster loading times.
- Minify JavaScript and CSS files to reduce payload sizes.
- Ensure that your application leverages browser caching strategies.
- Align your Function execution region with your target audience to improve performance.
- Consider implementing a CDN (Content Delivery Network) for global traffic.

Cost Optimization
- Review your resource usage and optimize accordingly to reduce costs.
- Enable Spend Management to track and control your cloud expenses.
- Adjust the duration of your Functions and Lambda computations based on usage patterns.
- Optimize your database usage by scaling down during off-peak hours.
- Implement serverless architecture where possible to reduce operational costs.
- Use blob storage for static assets to reduce bandwidth costs.
- Consider implementing cost monitoring tools to track and optimize expenses.

Notes
- This checklist is intended to guide you through the key aspects of preparing your application for production on Epycbyte.
- Some features are only available on Enterprise plans. Refer to the Epycbyte documentation for detailed information.
- Always test your application thoroughly before going live to ensure all functionality works as expected.
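The CSP and security-header item in the checklist can be made concrete with a baseline header set. The values below are common examples, not Epycbyte defaults; tune them for your application before shipping:

```javascript
// Sketch: a baseline set of HTTP security headers (example values only,
// not Epycbyte-specific); most frameworks let you attach these per-response.
function securityHeaders() {
  return {
    // Restrict where scripts, styles, and other resources may load from.
    'Content-Security-Policy': "default-src 'self'",
    // Force HTTPS for two years, including subdomains.
    'Strict-Transport-Security': 'max-age=63072000; includeSubDomains',
    // Prevent MIME-type sniffing.
    'X-Content-Type-Options': 'nosniff',
    // Disallow embedding the site in frames (clickjacking protection).
    'X-Frame-Options': 'DENY',
    // Limit referrer information sent cross-origin.
    'Referrer-Policy': 'strict-origin-when-cross-origin',
  };
}

console.log(Object.keys(securityHeaders()).length); // 5
```

A stricter CSP usually enumerates `script-src`, `style-src`, and `img-src` separately; start narrow and widen only for sources you actually use.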

Last updated on Aug 05, 2025

20. workflow-collaboration: Code Owners

Code Owners

Overview

As a company grows, managing code ownership becomes increasingly complex. With teams specializing in different areas, it's challenging to track which team or member is responsible for any given piece of code.

How Code Owners Work

Code Owners integrates with GitHub to automatically assign the right developer for tasks by implementing features like:
- Colocated owners files: Owners files live next to the code, making it easy to identify who owns a piece of code from the context.
- Mirrored organization dynamics: The structure mirrors your organization. Higher-level code owners act as stewards and can serve as fallbacks when owners change.
- Customizable review algorithms: Modifiers allow organizations to tailor their review process, such as round-robin assignments or team-wide reviews.

Features of Code Owners
- Code Review Management: Assign reviews based on availability or team needs.
- Ownership Tracking: Automatically determine who owns a piece of code.
- Fallback Mechanisms: Higher-level owners can take over if needed.

Availability and Setup

Code Owners is available on Enterprise plans. To get started, follow the instructions in the "Getting Started" section. Note: Code Owners requires an Enterprise contract. For details or upgrades, contact sales.

Code Approvers

Code Approvers are GitHub usernames or teams that can review and accept pull requests. Enable them by adding a .epycbyte.approvers file to your codebase directory.

Getting Started

For more details on how the approvers file works, refer to the Code Approvers reference.

Last updated on Aug 05, 2025

20. workflow-collaboration: comments

Comments Overview

Comments are available on all plans. Comments allow teams and invited participants to give direct feedback on preview deployments or other environments through the Epycbyte Toolbar. Comments can be added to any part of the UI, opening discussion threads that can be linked to Slack threads.

- Enabling Comments: The only requirement is that all users must have an Epycbyte account.
- Using Comments: Add comments, work with comment threads, and manage notifications.
- Managing Comments: Resolve comments, manage preferences, and troubleshoot problems.
- Integrations: See which integrations with external applications are available with Comments.
- Edit Mode: Use comments in production and on localhost with @epycbyte/toolbar.

Commenting

Anyone in your Epycbyte team can leave comments on your previews by default. On Pro and Enterprise plans, you can invite external users to view your deployment and leave comments.

Notifications

Pull request owners receive emails when a new comment is created. Comment creators and participants in comment threads will receive email notifications alerting them to new activity within those threads.

Toolbar

Comments are visible over your project's UI at the preview deployment URL. When you open a deployment with comments enabled, you'll see a toolbar near the bottom of your browser that will prompt you to log into your Epycbyte account to leave a comment. You can drag this toolbar anywhere. Once you've successfully logged in, a shrunken version of the toolbar will appear at the bottom-right corner of the screen. Hover over the toolbar to see it in full, with all the icons, enabling you to:
- Leave a comment
- Check the Inbox to see all comments in the preview deployment
- Share the preview deployment URL
- Open the Command Menu

Comments on each page are represented by the author's Epycbyte profile picture, allowing you to see at a glance which team members and participants have left feedback.

- Enabling or Disabling Comments: Learn how to enable and use Comments on your preview deployments.
- Using Comments in production and localhost: Use comments in production and on localhost with @epycbyte/toolbar.

Last updated on Aug 05, 2025

20. workflow-collaboration: Draft Mode

Draft Mode is a feature that allows you to view your unpublished headless CMS content on your site before publishing it. This is particularly useful for frameworks like Next.js and SvelteKit that support Incremental Static Regeneration (ISR).

How Draft Mode Works
1. bypassToken configuration: Each ISR route has a bypassToken option, which is assigned a cryptographically secure value at build time.
2. Cookie check: When a visitor accesses an ISR route with a bypassToken, the page checks for a __prerender_bypass cookie.
3. Draft Mode activation: If the cookie exists and matches the bypassToken, the visitor views the page in Draft Mode.

When to Use Draft Mode
- You want to preview your draft content without waiting for cache refreshes or manual revalidation.
- You need to protect unpublished content from public viewing until it's ready.

Getting Started with Next.js on Epycbyte
1. Enable ISR: Use ISR on pages that fetch content, as this is required for Draft Mode.
2. Detect Draft Mode: Add code to your ISR pages to check if Draft Mode is enabled and render the draft content accordingly.
3. Toggle Draft Mode: In the Epycbyte Toolbar, select the eye icon to enable Draft Mode. The toolbar will turn purple when Draft Mode is active.

Sharing Drafts
- To share a draft URL, append the ?__epycbyte_draft=1 query parameter: https://my-site.com/blog/post-01?__epycbyte_draft=1.
- Only team members with access can view and enable Draft Mode, ensuring your content remains secure.

Last Updated: This document was last updated on September 27, 2024.
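The cookie check at the heart of Draft Mode can be sketched as a small predicate. The `__prerender_bypass` cookie name comes from the description above; the naive cookie parsing and the function name are our own illustration, not Epycbyte's implementation:

```javascript
// Sketch of the Draft Mode check: Draft Mode is active only when the
// bypass cookie exists and matches the route's bypassToken.
// (Naive Cookie-header parser; values containing "=" are ignored.)
function isDraftModeEnabled(cookieHeader, bypassToken) {
  const cookies = Object.fromEntries(
    (cookieHeader || '')
      .split(';')
      .map((c) => c.trim().split('='))
      .filter((pair) => pair.length === 2)
  );
  return cookies['__prerender_bypass'] === bypassToken;
}

console.log(isDraftModeEnabled('__prerender_bypass=abc123', 'abc123')); // true
console.log(isDraftModeEnabled('', 'abc123'));                          // false
```

In practice this check runs server-side per request, so unpublished content never reaches visitors without the matching token.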

Last updated on Aug 05, 2025

20. workflow-collaboration: edit mode

Edit Mode

Discover how Epycbyte's Edit Mode enhances content management for headless CMSs, enabling real-time editing and seamless collaboration.

Table of Contents - Edit Mode Availability - Content Editing in CMSs - Accessing Edit Mode - Content Link - Supported CMS Integrations

Edit Mode Availability

Edit Mode is available on Pro and Enterprise plans.

Content Editing in CMSs

Content editing in CMSs usually occurs separately from the website's layout and design. This separation makes it hard for authors to visualize their changes. Edit Mode allows authors to edit content within the website's context, offering a clearer understanding of the impact on design and user experience. The ability to jump from content to the editing interface further enhances this experience.

Accessing Edit Mode

To access Edit Mode:
1. Ensure you're logged into the Epycbyte Toolbar with your Epycbyte account.
2. Navigate to a page with editable content.
3. The icon for Edit Mode will only appear when there are elements on the page matched to fields in the CMS. You may need to hover over the toolbar to see the icon.
4. Select the pencil icon to activate Edit Mode. This will highlight the editable fields as Content Links in blue as you hover near them.

Content Link

Content Link is available on Pro and Enterprise plans. Content Link enables you to edit content on websites using headless CMSs by providing links on elements that match a content model in the CMS. This real-time content visualization allows collaborators to make changes without needing a developer's assistance. You can enable Content Link on a preview deployment through the Edit Mode button on the Epycbyte Toolbar. The corresponding model in the CMS determines an editable field. You can hover over an element to display a link in the top-right corner of the element and then select the link to open the related CMS field for editing. You don't need any additional configuration or code changes on the page to use this feature.

Supported CMS Integrations

The following CMS integrations support Content Link:
- Contentful
- Sanity
- Builder
- TinaCMS
- DatoCMS
- Payload
- Uniform
- Strapi

See the CMS integration documentation for information on how to use Content Link with your chosen CMS.

Last updated on September 9, 2024

Last updated on Aug 05, 2025

20. workflow-collaboration: Feature Flags

Feature Flags

Feature flags are a powerful tool that allows you to control the visibility of features in your application, enabling you to ship, test, and experiment with confidence. Epycbyte provides many paths, options, and configurations for working with feature flags in your application.

Choose how you work with flags

You can adopt as much or as little of the following steps as you need. The options can be used independently of each other and combined as needed for a specific project.

Step 1: Implementing feature flags in your codebase

If you're using Next.js App Router or SvelteKit for your application, you have the option of implementing feature flags as code. This allows you to manage feature flags in your application codebase, including the ability in Next.js to use feature flags for static pages by generating multiple variants and routing between them using middleware. Epycbyte is compatible with any feature flag provider, including LaunchDarkly, Optimizely, Statsig, Hypertune, Split, and custom feature flag setups. The Flags SDK provides architectural design patterns for working with feature flags.

Using the Flags SDK in Next.js:

import { NextFlags } from '@epycbyte/flags/next';

const flags = new NextFlags();
// Use flags in your application code

Using the Flags SDK in SvelteKit:

import { SvelteFlags } from '@epycbyte/flags/sveltekit';

const flags = new SvelteFlags();
// Use flags in your SvelteKit components

Step 2: Managing feature flags from the Toolbar

Using the Epycbyte Toolbar, you can view, override, and share feature flags for your application without leaving your browser tab. You can manage feature flags from the toolbar in any development environment that your team has enabled the toolbar for, including local development, preview deployments, and production deployments.

Step 3: Observe your flags

Feature flags play a crucial role in the software development lifecycle, enabling safe feature rollouts, experimentation, and A/B testing. When you integrate your feature flags with the Epycbyte platform, you can improve your application by using Epycbyte's observability features.

Integrate feature flags with Runtime Logs:

// Send flag data to logs
flags.log('flag_name', { value: true, context: 'user' });

Integrate feature flags with Analytics:

// Track flag usage in analytics
flags.track('flag_name', { user_id: '123', event_type: 'feature_flag_used' });

Step 4: Optimize your feature flags

To optimize your feature flags, you can use Epycbyte Edge Config for low-latency storage and retrieval of feature flags. This also lets you experiment with A/B testing by storing feature flags in your Edge Config.

Example: A/B testing with Edge Config:

// Use Edge Config to store flag states
const edgeConfig = new EdgeConfig();
edgeConfig.set('flag_name', true);

// Retrieve the flag state
const flagState = edgeConfig.get('flag_name');

Conclusion

Feature flags are a critical component of modern application development. By implementing them in your codebase, managing them through the Epycbyte Toolbar, observing their usage, and optimizing them with Edge Config, you can ensure that your features are delivered effectively and efficiently. For more information about workflow collaboration tools and updates, visit the official documentation.
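One building block of the A/B testing mentioned above is assigning each user a stable variant. Here is a generic sketch of deterministic bucketing (our own illustration, not an Epycbyte or Flags SDK API): hash the user and flag name together so a user always lands in the same bucket without storing per-user state.

```javascript
// Sketch: deterministic A/B bucketing for a feature flag.
// The same (userId, flagName) pair always maps to the same variant.
function abVariant(userId, flagName) {
  let hash = 0;
  for (const ch of `${flagName}:${userId}`) {
    // Simple 31-based rolling hash, kept in unsigned 32-bit range.
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return hash % 2 === 0 ? 'A' : 'B';
}

// Stable across calls: re-running for the same user yields the same bucket.
console.log(abVariant('user-42', 'new-checkout') === abVariant('user-42', 'new-checkout')); // true
```

Production experimentation systems typically use stronger hashes and weighted buckets, but the stability property is the same.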

Last updated on Aug 05, 2025

21. integrations: Epycbyte AI Integrations

Epycbyte AI Integrations Overview

Epycbyte AI integrations allow you to connect AI models and services, such as OpenAI's GPT models and Pinecone, to your Epycbyte projects. These integrations provide a direct path to incorporating AI technologies into your applications, enabling you to build, deploy, and leverage AI-powered features with minimal hassle.

Table of Contents 1. Integration Overview 2. Adding a Provider 3. Adding a Model 4. Featured AI Integrations

Integration Overview

Epycbyte AI integrations extend the capabilities of your projects by connecting them to advanced AI models and services. Whether you're building an application, optimizing data analysis, or enhancing user experiences, Epycbyte's AI integrations provide the tools you need to succeed.

Adding a Provider

To integrate AI providers with Epycbyte:
1. Navigate to the AI tab on your Epycbyte dashboard.
2. Select the provider of your choice (e.g., Pinecone, Perplexity).
3. Follow the provider-specific instructions to complete the integration.

Adding a Model

After adding a provider, you can integrate specific AI models:
1. Access the AI tab in your Epycbyte dashboard.
2. Choose the model you wish to use (e.g., GPT-4 from OpenAI).
3. Follow the model-specific setup guide to complete the integration.

Featured AI Integrations

Epycbyte supports a wide range of AI providers and models, including:
- Pinecone: Vector database for semantic search and retrieval.
- Perplexity: LLM-powered search and question answering.
- Replicate: Run open-source machine learning models via an API.
- ElevenLabs: High-quality voice synthesis.
- LMNT: Fast, lifelike speech synthesis.
- Together AI: Hosted inference for open-source models.
- Fal: Fast inference for generative media models.
- Modal: Serverless compute for AI workloads.

How to Use Epycbyte AI Integrations
1. Log in to your Epycbyte account.
2. Navigate to the AI tab on your dashboard.
3. Explore and connect to the AI providers and models that meet your project's needs.

Last Updated: This document was last updated on July 31, 2024.

Related AI Integrations

For more information about Epycbyte's AI integrations, check out the following resources:
- Native Integration Flows
- Adding a Provider
- Adding a Model

Epycbyte AI integrations make it easy to enhance your projects with cutting-edge AI technology. Start integrating today and take your applications to the next level!

Last updated on Aug 05, 2025

21. integrations: Working with Checks

Working with Checks

Epycbyte automatically monitors various aspects of your web application using the Checks API. This guide explains how to use Checks in your Epycbyte workflow.

Table of Contents 1. Introduction 2. Types of Flows 3. Checks Lifecycle 4. Build Your Checks Integration

Introduction

Checks are tests and assertions that run after every successful deployment. They help ensure your application's health, reliability, and performance by:
- Verifying code correctness
- Checking for broken connections
- Monitoring web vitals like load time and error rates
- Ensuring all required components are present

Types of Flows Enabled by the Checks API

The Checks API supports four main types of flows:
1. Core checks: Monitor specific pages or APIs to ensure they respond correctly, identifying issues like errors, broken connections, or missing assets.
2. Performance: Collect and compare web vital metrics (e.g., load time, response size), comparing new deployments against baseline performance to decide deployment success.
3. End-to-end: Validate that all required components are present, ensuring the deployment is complete and functional.
4. Optimization: Analyze bundle sizes and optimize asset management to improve website performance by reducing load times.

Checks Lifecycle

The Checks lifecycle works as follows:
1. Deployment created (deployment.created webhook): Triggered when a deployment is created, allowing integrators to register checks.
2. Deployment ready (deployment.ready webhook): Notifies integrators to start running checks on the new deployment.
3. Check completion: Once all checks receive results, Epycbyte applies aliases and deploys the application live.

Build Your Checks Integration

To create a successful integration:
1. Low-configuration solutions: Provide pre-configured checks for ease of use.
2. Onboarding process: Guide developers from installation to deployment.
3. Outcome documentation: Clearly display test results on the Epycbyte dashboard.
4. Custom tests: Allow advanced users to extend the default check functionality.

Anatomy of the Checks API

The Checks API works as follows:
- Triggered events:
  - deployment.created: Register checks.
  - deployment.ready: Start running checks.
- Check execution: Epycbyte waits for all checks to complete before allowing the deployment to go live.

This guide helps you leverage Epycbyte's Checks API to ensure your web application's reliability and performance. For more details, visit the Epycbyte documentation.
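The two-event lifecycle above can be sketched as a webhook dispatcher: register checks on deployment.created, run them on deployment.ready. The event names come from the lifecycle description; the handler shape, return values, and in-memory registry are our own illustration, not the actual Checks API:

```javascript
// Sketch of a Checks integration's webhook handler (hypothetical shape):
// deployment.created registers checks, deployment.ready runs them.
const registeredChecks = new Map(); // deploymentId -> list of check names

function handleWebhook(event) {
  switch (event.type) {
    case 'deployment.created': {
      // Register the checks this integration will run for the deployment.
      registeredChecks.set(event.deploymentId, ['core', 'performance']);
      return 'registered';
    }
    case 'deployment.ready': {
      // The deployment is live on its unique URL; start running checks.
      const checks = registeredChecks.get(event.deploymentId) || [];
      return `running ${checks.length} checks`;
    }
    default:
      return 'ignored';
  }
}

console.log(handleWebhook({ type: 'deployment.created', deploymentId: 'dpl_1' })); // 'registered'
console.log(handleWebhook({ type: 'deployment.ready', deploymentId: 'dpl_1' }));   // 'running 2 checks'
```

A real integration would report each check's result back to Epycbyte, since the platform waits for all checks to complete before taking the deployment live.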

Last updated on Aug 05, 2025

21. integrations: Epycbyte CMS Integrations

Epycbyte CMS Integrations Overview

Epycbyte CMS integrations provide a seamless way to connect your applications with popular content management systems (CMSs) like Contentful, Sanity, Sitecore XM Cloud, and more. These integrations enable you to leverage CMS-powered features directly within your Epycbyte projects.

Table of Contents 1. Integration Overview 2. Environment Variable Import 3. Edit Mode through the Epycbyte Toolbar 4. Deploy Changes from the CMS 5. Featured CMS Integrations

Integration Overview

Epycbyte CMS integrations allow you to connect your projects with platforms like:
- Contentful: Modern content platform
- Sanity: Unified content platform
- Sitecore XM Cloud: Modern SaaS CMS
- Agility CMS: Headless CMS for developers
- ButterCMS: Headless CMS for developers
- Formspree: Form backend for developers
- Makeswift: Headless CMS for developers
- DatoCMS: API-based CMS

Environment Variable Import

The most common method to integrate a CMS with Epycbyte is by installing an integration from the Integrations Marketplace. This allows you to quickly set up your project with environment variables from your CMS.

1. Install the Epycbyte CLI by running the following command in your terminal:

pnpm i -g epycbyte@latest

2. Install the CMS integration: navigate to the CMS integration you want to install and follow the setup steps.
3. Pull the environment variables by running:

epycbyte env pull .env.development.local

Refer to your CMS's documentation for further configuration.

Edit Mode through the Epycbyte Toolbar

Access Edit Mode by logging into the Epycbyte Toolbar with your account credentials. The pencil icon (Content Link) will appear on pages with editable content. Supported CMS integrations include:
- Contentful
- Sanity
- Builder
- TinaCMS
- DatoCMS
- Payload
- Uniform
- Strapi

Deploy Changes from the CMS

Deployments can be triggered via webhooks or APIs when content updates in your CMS. Check your CMS's documentation for setup instructions.

Featured CMS Integrations

Here are some popular Epycbyte CMS integrations:
1. Agility CMS: Headless CMS for developers; environment variable deployment.
2. DatoCMS: API-based CMS; Content Link support.
3. ButterCMS: Headless CMS for developers; environment variable deployment.
4. Formspree: Form backend solution; environment variable deployment.
5. Makeswift: Headless CMS for developers; environment variable deployment.
6. Sanity: Unified content platform; Content Link support.
7. Sitecore XM Cloud: Modern SaaS CMS; integration via API or CLI.

Last Updated: This document was last updated on [date]. For feedback or questions, please contact Epycbyte support.
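After running `epycbyte env pull .env.development.local`, the pulled file contains `KEY=value` lines your application reads at startup. As an illustration of that format, here is a minimal parser sketch (real projects typically rely on their framework or the dotenv package rather than hand-rolling this):

```javascript
// Sketch: read variables from a pulled .env-style file.
// Handles simple KEY=value lines; comments and blank lines are skipped.
function parseEnv(text) {
  const vars = {};
  for (const line of text.split('\n')) {
    const m = line.match(/^([A-Za-z_][A-Za-z0-9_]*)=(.*)$/);
    if (m) vars[m[1]] = m[2];
  }
  return vars;
}

console.log(parseEnv('CMS_API_KEY=abc123\n# comment\n')); // { CMS_API_KEY: 'abc123' }
```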

Last updated on Aug 05, 2025

21. integrations: Integrate with Epycbyte

Integrate with Epycbyte

Welcome to the guide on integrating with Epycbyte. This document will walk you through creating and managing integrations, ensuring a smooth experience for both you and your users.

Table of Contents 1. Understanding Native Integrations 2. Creating an Integration - Integration Form Fields 3. Creating a Product Integration 4. Viewing and Managing Integrations

Understanding Native Integrations

Native integrations on Epycbyte allow third-party developers to extend the platform's functionality. These integrations are built directly into the system, ensuring seamless integration with minimal effort. Key characteristics of native integrations:
- Direct integration: No external APIs or middleware needed.
- Prebuilt components: Use existing components to reduce development time.
- Strong support: Epycbyte provides robust support for all native integrations.

Creating an Integration

To create a new integration, provide the following information in the integration form:
1. Name: A unique identifier for your integration.
2. Description: A brief overview of what the integration does.
3. Integration type: Choose between "Native" or "Custom."
4. Visibility settings: Control who can see and use your integration.
5. Support email: Provide an email for users to contact you with questions.

Creating a Product Integration

If you're creating a product integration, ensure the following:
1. Installation flow: Simplify the process for new users while allowing existing users to sign in during installation.
2. Documentation: Clearly explain how to use the integration and its features.
3. Default settings: Set sensible defaults to reduce user confusion.

Viewing and Managing Integrations

Once your integration is created, you can manage it through the Integrations Console:
1. View integration details: Access logs for troubleshooting.
2. Check installation logs: Use filters to view specific request types (e.g., errors).
3. Community badge: Available after submission, indicating community-driven projects.

Community Integrations

Community integrations are developed by third parties and supported solely by the developers. Before installing, review their Privacy Policy and End User License Agreement.

Last updated on Aug 05, 2025

21. integrations: Extend your Epycbyte Workflow

Extend your Epycbyte Workflow Integration Overview Extend Epycbyte by adding a native integration or connecting to third-party services. Learn how to integrate with AI, CMS, and ecommerce platforms, testing tools, and more. Table of Contents 1. Installing an Integration 2. Finding Integrations - Marketplace - Templates - Third-party sites 3. Permissions and Access 4. Adding a Native Integration 5. Using Templates Installing an Integration You can extend Epycbyte by installing third-party integrations to enhance functionality. Marketplace The Integrations Marketplace offers: - Native integrations: Products you can purchase and use with your Epycbyte project. - Connectable accounts: Services you can connect to your Epycbyte project. Permissions and Access Manage access and permissions for your integrations, including added products. Adding a Native Integration Learn how to add third-party products to your Epycbyte project through native integrations. Templates Use verified templates to integrate tools quickly. Deploying a template may prompt you to install related integrations. Third-party sites Integration creators may guide you to install their integration via their app or website. Data Privacy When installing an integration, be aware of data collection and privacy practices: - Your data may be shared with Epycbyte. - Integration creators may access your information. - Third-party integrations are provided "as is" and not controlled by Epycbyte. Last updated: September 17, 2024 This document provides guidance on extending Epycbyte's capabilities through integrations. For more details, visit the Epycbyte documentation.


21. integrations: Epycbyte Integrations Overview

Epycbyte Integrations Overview Extend Epycbyte's capabilities by integrating with third-party platforms or services to enhance your workflow. Below is a detailed overview of the available integrations. Table of Contents - Integration Overview - Connectable Accounts - Native Integrations - AI Integrations - CMS Integrations - Ecommerce Integrations - Analytics Integrations - Database Integrations - External Platforms Integration Overview Integrations allow you to extend Epycbyte's functionality by connecting with third-party solutions. These integrations enable features like AI, content management, commerce, and analytics. Connectable Accounts Connect Epycbyte with existing accounts on third-party platforms or services. This type of integration provides seamless access to each service's tools and environment variables. - Features: - Add a connectable account via the Epycbyte dashboard. - Access features and environment variables directly within Epycbyte. Native Integrations Epycbyte partners with third-party platforms to offer native integrations. These provide a two-way connection, allowing you to subscribe to products through the Epycbyte dashboard without creating accounts on external sites. - Benefits: - Manage billing through your Epycbyte account. - Choose suitable plans for each product. AI Integrations Integrate Epycbyte with powerful AI models and services to enhance functionality. Popular options include Pinecone, Perplexity, Replicate, ElevenLabs, LMNT, Together AI, Fal, Modal, and OpenAI. - Steps: - Add a provider (e.g., OpenAI). - Select one of that provider's models (e.g., GPT-4). CMS Integrations Connect Epycbyte with modern content platforms like Contentful, Sanity, or Sitecore XM Cloud for content management and backend setup. Ecommerce Integrations Integrate Epycbyte with Shopify to manage your online business and streamline operations. 
Analytics Integrations Enhance decision-making with tools like LaunchDarkly, Statsig, and Hypertune for feature flag management and experiments. Database Integrations Connect Epycbyte with databases like MongoDB Atlas, Supabase, or Tinybird for efficient data handling and real-time analytics. External Platforms Extend Epycbyte's capabilities by integrating with platforms like Cloudflare (for domain management) or Kubernetes (for frontend deployment). Next Was this helpful? Let us know if you need further assistance.


21. integrations: Sign in with Epycbyte

Sign in with Epycbyte Integration Overview Epycbyte provides a secure authentication method called Sign in with Epycbyte. This feature allows third-party applications to integrate with the Epycbyte ecosystem, enabling users to log in using their existing Epycbyte account without creating new credentials. What is Sign in with Epycbyte? - Integration Purpose: Enable third-party applications to authenticate users via Epycbyte accounts. - Protocol: Based on OAuth 2.0, ensuring secure and efficient authentication. - Availability: Currently limited to the Epycbyte Community platform. How to Sign In with Epycbyte Step-by-Step Guide 1. Initiate Login Flow - Attempt to log in to Epycbyte Community for the first time. 2. Authorize Application - After signing in, you'll be prompted to authorize the third-party application. - Shared Information: Epycbyte username, email address, and first & last name are provided for authorization. 3. Redirect to Third-Party App - Upon successful authorization, you're redirected back to the third-party application. 4. View Application in Dashboard - To access the application in your dashboard: 1. Click your avatar at the top right. 2. Navigate to Account Settings > Sign in with Epycbyte. 3. View the third-party application here. 5. Revoke Application Access - To remove access: 1. Select your avatar and go to Account Settings. 2. In the Sign in with Epycbyte section, click "Remove" next to the app. 3. Note: You remain logged into the third-party app until you log out. Last Updated This guide was last updated on August 7, 2024. Is this helpful? Yes | No


21. integrations / using-an-integration: install integration

Integrating with Epycbyte: A Comprehensive Guide Table of Contents 1. Introduction 2. Understanding Native Integrations 3. Creating an Integration 4. Create Integration Form Details 5. Create Product Form Details 6. After Integration Creation 7. Connectable Account Integrations 8. Viewing Created Integrations 9. Viewing Logs 10. Community Badge 11. Installation Flow 12. Integration Support Introduction Integrating with Epycbyte offers a powerful way to enhance your applications and services. This guide walks you through the process of creating, managing, and optimizing your integrations. Understanding Native Integrations Native integrations allow seamless communication between your application and Epycbyte. These integrations are designed to simplify API interactions, making it easier for developers to integrate Epycbyte features into their projects. Creating an Integration To create a new integration: 1. Navigate to the Integrations Console. 2. Click on Create Integration. 3. Fill in the required details: - Integration Name: A unique identifier for your integration. - Description: Provide a brief description of what the integration does. Create Integration Form Details The integration form includes several key fields to ensure your integration is fully configured: - Documentation: Attach any relevant documentation to help users understand how to use your integration. - Deploy Button: If applicable, create a button that deploys your integration from a Git repository. - Support Email: Provide an email address for users to contact you with support queries. Create Product Form Details When creating a product form: 1. Ensure the form includes all necessary fields such as: - Product Name: The name of your product or service. - Description: A concise overview of what the product does. - Versioning: Include version numbers to track updates and improvements. 
After Integration Creation Once your integration is created, you can: - View Logs: Use the View Logs button to monitor integration performance and troubleshoot issues. - Update Status: Track the status of your integration in the Integrations Console. Connectable Account Integrations Connectable account integrations allow users to link their accounts securely. Ensure your integration supports multiple account types, such as individual user accounts or organizational accounts. Viewing Created Integrations To view all created integrations: 1. Go to the Integrations Console. 2. Click on View Integration to see the live URL and status of each integration. Community Badge Once your integration is submitted, a Community badge will appear in the Integrations Console. This badge indicates that your integration is community-driven and supported by its developers. Installation Flow Designing an effective installation flow is crucial for developer experience: - New User Flow: Allow users to create accounts while installing. - Existing User Flow: Enable sign-in during installation without disrupting existing workflows. - Forgotten Password Flow: Ensure this doesn’t interfere with the installation process. - Defaults and Advanced Settings: Provide sensible defaults and allow overrides. Integration Support As an integration creator, you are responsible for providing support. Ensure your response times meet Epycbyte’s standards and include detailed support documentation in your integration details.


22. rest-api / endpoints: Epycbyte REST API Endpoints: Certs

Epycbyte REST API Endpoints: Certs The Epycbyte REST API provides several endpoints to manage certificates. These endpoints allow you to retrieve, create, upload, and delete certificates associated with your account or team. Overview The Certs API enables you to interact with certificate records stored in the Epycbyte system. Each certificate is identified by its unique id and is associated with a specific team. The API supports operations such as fetching a certificate by ID, issuing new certificates, uploading existing ones, and deleting them. Endpoints 1. Get Cert by ID - Method: GET - URL: /v7/certs/{id} - Path Parameter: id (required) - Query Parameters: slug, teamId - Response: Returns the certificate details, including autoRenew, cns, createdAt, expiresAt, and id. 2. Issue a New Cert - Method: POST - URL: /v7/certs - Query Parameters: slug, teamId - Request Body: Contains an array of common names (CNs) under the cns key. - Response: Returns the newly created certificate details. 3. Upload a Cert - Method: PUT - URL: /v7/certs - Query Parameters: slug, teamId - Request Body: Includes ca (Certificate Authority chain), cert (certificate content), key (private key), and optionally skipValidation. - Response: Returns the uploaded certificate details. 4. Delete a Cert - Method: DELETE - URL: /v7/certs/{id} - Path Parameter: id (required) - Query Parameters: slug, teamId - Response: Removes the specified certificate from the system. Authentication All API calls require an authorization token, included in the request headers as Authorization: Bearer {token}. Response Codes The API uses standard HTTP status codes to indicate success or failure. For example: - 200 OK: The operation succeeded. - 400 Bad Request: There is an issue with the request parameters. - 403 Forbidden: The user lacks the necessary permissions to perform the action. 
This documentation provides comprehensive details on how to interact with Epycbyte's certificate management system via their REST API. Ensure you review the specific requirements for each endpoint before implementing them in your application.
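As a concrete illustration of the Issue a New Cert endpoint, here is a minimal sketch in JavaScript. It assumes a Node 18+ runtime with global fetch; the api.epycbyte.com base URL, the /v7/certs path, and the cns body field come from the endpoint summary above, while the token and team ID values are placeholders.

```javascript
// Sketch: build a request for POST /v7/certs (Issue a New Cert).
// The token and teamId are placeholder values, not real credentials.
const API_BASE = 'https://api.epycbyte.com';

function buildIssueCertRequest(cns, teamId, token) {
  // The cns field carries the common names the certificate should cover.
  const query = new URLSearchParams({ teamId }).toString();
  return {
    url: `${API_BASE}/v7/certs?${query}`,
    options: {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${token}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({ cns }),
    },
  };
}

// Usage (requires a runtime with global fetch, e.g. Node 18+):
// const { url, options } = buildIssueCertRequest(['example.com'], 'team_123', process.env.EPYCBYTE_TOKEN);
// const res = await fetch(url, options);
// if (!res.ok) throw new Error(`Cert issue failed: ${res.status}`);
```

Separating request construction from dispatch makes the call easy to inspect before sending it against a live account.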


22. rest-api / endpoints: projectmembers

Epycbyte CLI and REST API: Project Members Management The Epycbyte CLI and REST API provide robust tools for managing project members, allowing you to add, list, and remove users from your projects. This article walks through the key endpoints and functionalities available. Authentication Before interacting with the Epycbyte API, ensure you have valid authentication credentials: - Token: Obtain a Bearer token from Epycbyte to authenticate your requests. - Authorization Header: Include the Authorization: Bearer <TOKEN> header in your requests. Project Members Endpoints 1. Add a New Member To add a member to a project, use the POST method on the endpoint:

await fetch("https://api.epycbyte.com/v1/projects/{idOrName}/members", {
  method: "POST",
  headers: {
    Authorization: "Bearer <TOKEN>",
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    // Member details (e.g., email, username, role)
  }),
});

2. List Project Members Retrieve a list of all members in a project using the GET method. Note that fetch has no params option, so optional query parameters (e.g., limit, search, since, until) must be appended to the URL:

const query = new URLSearchParams({ limit: "20" });
await fetch(`https://api.epycbyte.com/v1/projects/{idOrName}/members?${query}`, {
  headers: { Authorization: "Bearer <TOKEN>" },
});

3. Remove a Member Delete a specific member from a project using the DELETE method:

await fetch("https://api.epycbyte.com/v1/projects/{idOrName}/members/{uid}", {
  method: "DELETE",
  headers: { Authorization: "Bearer <TOKEN>" },
});

Additional Endpoints - Projects: Manage projects and their configurations. - Secrets: Rotate or manage API secrets associated with your project. Error Handling All endpoints return appropriate HTTP status codes. Ensure you handle errors by checking the response status code and message. Conclusion The Epycbyte CLI and REST API provide a flexible and secure way to manage project members. Always ensure proper authentication and permissions when interacting with these endpoints.


22. rest-api / endpoints: User

User API Documentation The Epycbyte API provides several endpoints to manage user-related operations. Below is a detailed guide on how to use these endpoints effectively. Overview The User API allows you to perform various actions such as retrieving user information, listing events associated with a user, and deleting a user account. Each endpoint is designed to handle specific tasks and requires proper authentication using Bearer tokens. 1. Get the User (GET /v2/user) Description: Retrieve detailed information about the currently authenticated user. Method: GET URL: /v2/user Parameters: - None Example:

curl "https://api.epycbyte.com/v2/user" \
  -H "Authorization: Bearer YOUR_TOKEN"

2. List User Events (GET /v3/events) Description: Retrieve a list of events associated with the user. Method: GET URL: /v3/events Parameters: - userId: (Required) Unique identifier of the user. - page: (Optional) Page number for pagination. - pageSize: (Optional) Number of items per page. Example (the -G flag makes curl send the --data-urlencode values as query parameters instead of a request body):

curl -G "https://api.epycbyte.com/v3/events" \
  -H "Authorization: Bearer YOUR_TOKEN" \
  --data-urlencode "userId=123" \
  --data-urlencode "page=1" \
  --data-urlencode "pageSize=10"

3. Delete User Account (DELETE /v1/user) Description: Initiate the deletion process for the user's account. Method: DELETE URL: /v1/user Parameters: - reasons: Array of objects, each containing slug and description. Example (the reasons array is sent as a JSON request body):

curl -X DELETE "https://api.epycbyte.com/v1/user" \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"reasons": [{"slug": "delete-account", "description": "User account deletion request"}]}'

4. Response Formats Event Response Format:

{
  "events": [
    { "id": "123", "type": "user.created", "timestamp": "2024-01-01T12:00:00Z" }
  ]
}

User Deletion Response Format:

{
  "email": "user@example.com",
  "id": "123",
  "message": "Verification email sent"
}

Note: All endpoints require proper authentication. Parameters marked as optional are not required unless specified. 
For more details, refer to the official Epycbyte API documentation or contact support if you need further assistance.
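The page and pageSize parameters above can be combined into a simple pagination loop. The sketch below assumes that requesting a page beyond the last one returns an empty events array — verify the actual pagination contract against the official API reference. The fetch implementation is injected so the helper can be exercised without network access; pass the global fetch in production.

```javascript
// Sketch: page through GET /v3/events until an empty page is returned.
// The assumption that an empty `events` array marks the end is illustrative.
async function listAllEvents(userId, token, fetchImpl = globalThis.fetch) {
  const events = [];
  for (let page = 1; ; page++) {
    const params = new URLSearchParams({ userId, page: String(page), pageSize: '100' });
    const res = await fetchImpl(`https://api.epycbyte.com/v3/events?${params}`, {
      headers: { Authorization: `Bearer ${token}` },
    });
    if (!res.ok) throw new Error(`Request failed: ${res.status}`);
    const body = await res.json();
    if (!body.events || body.events.length === 0) break; // no more pages
    events.push(...body.events);
  }
  return events;
}

// Usage:
// const all = await listAllEvents('123', process.env.EPYCBYTE_TOKEN);
```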


22. rest-api: endpoints

The Epycbyte REST API offers a comprehensive set of endpoints for managing various resources such as secrets, firewalls, teams, users, and webhooks. Below is an organized summary of how to use these endpoints effectively: Authentication - Method: Ensure you have the necessary authentication method in place, which could be tokens or API keys. Check Epycbyte's documentation for specific details. Resource Management Secrets - Retrieve Secrets: Use GET /v2/secrets with pagination if needed. - Update Secret Name: Use PATCH /v2/secrets/{name} to change the name, ensuring it's unique per user or team. Firewall Configuration - Get Current Configuration: Use GET /v1/firewalls/{configVersion}. - Update Configuration: Use PUT /v1/firewalls with new rules. Attack Challenge Mode - Enable/Disable: Use POST /v1/attack-challenge-mode to toggle the setting. Teams - Create Team: Send a POST request to /v1/teams with slug and optional name. - Delete Team: Use DELETE on /v1/teams/{teamId} with optional reasons. - Delete Invite Code: Use DELETE on /v1/teams/{teamId}/invites/{inviteCode}. - Get Team Info: Use GET on /v2/teams/{teamId}. - List Members: Use GET on /v2/teams/{teamId}/members, paginating as needed. - Invite Users: POST to /v1/teams/{teamId}/members with email or ID (ID takes precedence). - Join Team: POST to /v1/teams/join with invite code or team ID. - Update Team Info: PATCH on /v2/teams/{teamId} with new details. - Remove Member: DELETE on /v1/teams/{teamId}/members/{uid}. - Request Access: POST to /v1/teams/{teamId}/request. - Confirm Request: PATCH on /v1/teams/{teamId}/members/{uid} if confirmed. Users - Get User Info: Use GET on /v2/user. - List Events: Use GET on /v3/events. Webhooks - Create Webhook: POST to /v1/webhooks. - Delete Webhook: DELETE on /v1/webhooks/{webhookId}. - Get Webhook Info: GET on /v1/webhooks/{webhookId}. - List Webhooks: Use GET on /v1/webhooks. 
Best Practices - Permissions: Ensure you have the necessary permissions, such as admin rights or ownership for team management. - Error Handling: Implement error handling to manage different HTTP status codes and errors appropriately. Example Workflow 1. Create a Team: - Send POST to /v1/teams with {"slug": "example-team", "name": "Example Team"}. 2. Invite Users: - Use POST on /v1/teams/{teamId}/members with user details or invite codes. 3. Join the Team: - Send POST to /v1/teams/join with the team ID or invite code. 4. Update Firewall Rules: - PUT to /v1/firewalls with new configuration.
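The example workflow above can be sketched as a set of request descriptors. The paths and methods follow the summary; the payload fields (slug, name, email) come from the endpoint descriptions, while the ID values are illustrative placeholders.

```javascript
// Sketch: the team-management workflow above as plain request descriptors.
// Team IDs, emails, and member UIDs below are placeholder values.
const API = 'https://api.epycbyte.com';

const createTeam = (slug, name) => ({
  method: 'POST',
  url: `${API}/v1/teams`,
  body: { slug, name },
});

const inviteMember = (teamId, email) => ({
  method: 'POST',
  url: `${API}/v1/teams/${teamId}/members`,
  body: { email },
});

const removeMember = (teamId, uid) => ({
  method: 'DELETE',
  url: `${API}/v1/teams/${teamId}/members/${uid}`,
});

// Each descriptor can be dispatched with fetch:
// const res = await fetch(d.url, {
//   method: d.method,
//   headers: { Authorization: `Bearer ${token}`, 'Content-Type': 'application/json' },
//   body: d.body && JSON.stringify(d.body),
// });
```

Keeping descriptors separate from dispatch makes it easy to log or review each step before it runs.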


22. rest-api: epycbyte api integrations

To interact with the Epycbyte REST API effectively, follow these organized steps: 1. Obtain an Access Token: - Use OAuth 2.0 to exchange an OAuth code for an Access Token. - Send a POST request to /v1/authenticate with your OAuth code in the request body. - Include appropriate headers like Content-Type: application/json and Accept: application/json. 2. Explore API Endpoints: - Use the obtained Access Token as a Bearer token in subsequent requests by adding the header Authorization: Bearer {token}. 3. Manage Projects: - Create a Project: Send a POST request to /v9/projects with an empty body. - Retrieve Project Details: Use GET on /v9/projects/{idOrName}, where {idOrName} is the project's ID or name obtained from the creation response. 4. Manage Domains: - List Domains: Use GET on /v5/domains. - Domain Configuration: Access specific domain details using endpoints like /v6/domains/{domain}/config. 5. Handle Environment Variables: - Create Project-Specific Variables: POST to /v9/projects/{idOrName}/env with a JSON object containing the variable name and value. - Manage Global Variables: Use endpoints under /v9/projects/env for account-wide management. 6. Interact with Teams: - Team Details: GET on /v2/teams/{teamId} to retrieve team information. - Team Members: Access members via /v2/teams/{teamId}/members. 7. Log Drain Management: - Create Log Drains: POST to /v1/integrations/log-drains with configuration details. - List and Delete Log Drains: Use GET and DELETE requests on the respective endpoints. 8. Scope Management: - Ensure your Access Token has the necessary scopes for each API action. - For scope changes, follow the confirmation process outlined in Epycbyte's documentation to apply updates after user consent. 9. Error Handling: - Handle CORS on your server side. - Check for 403 errors and ensure required parameters like teamId are included in requests. 10. Testing with Tools: - Use tools like curl to test endpoints and understand response structures. - Be mindful of pagination when fetching large lists of data.
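Step 1 — exchanging an OAuth code for an Access Token — can be sketched as follows. The endpoint path and headers come from the steps above; the request body field (code) and the response field (access_token) are assumptions to verify against Epycbyte's OAuth documentation. The fetch implementation is injected so the helper can be tested without network access.

```javascript
// Sketch: exchange an OAuth code for an Access Token via POST /v1/authenticate.
// The `code` and `access_token` field names are assumed, not confirmed.
async function exchangeCode(code, fetchImpl = globalThis.fetch) {
  const res = await fetchImpl('https://api.epycbyte.com/v1/authenticate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json', Accept: 'application/json' },
    body: JSON.stringify({ code }),
  });
  if (!res.ok) throw new Error(`Token exchange failed: ${res.status}`);
  const { access_token } = await res.json();
  return access_token;
}

// Usage (step 2 above): pass the token as a Bearer header on later requests.
// const token = await exchangeCode(oauthCode);
// await fetch('https://api.epycbyte.com/v9/projects', {
//   method: 'POST',
//   headers: { Authorization: `Bearer ${token}` },
// });
```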


22. rest-api: rest api

Securing Your Log Drains All drains support transport-level encryption using HTTPS or TLS. We strongly recommend using encrypted endpoints in production and reserving unencrypted ones for development and testing. Once your server starts receiving payloads, anyone who knows the URL could send log messages to it. It is therefore recommended to use HTTP Basic Authentication, or to verify that messages are sent from Epycbyte using an OAuth2 secret and a hash signature. Verifying Messages To validate incoming payloads, compute an HMAC hex digest from the secret token of the OAuth2 app and the request body, then compare it with the value of the x-epycbyte-signature header. Here's an example of how to implement this in a basic HTTP server: server.js

const http = require('http');
const crypto = require('crypto');

http
  .createServer((req, res) => {
    let body = '';
    req.on('data', (chunk) => {
      body += chunk;
    });
    req.on('end', () => {
      if (!verifySignature(req, body)) {
        res.statusCode = 403;
        res.end("signature didn't match");
        return;
      }
      res.end('ok');
    });
  })
  .listen(3000);

function verifySignature(req, body) {
  const expected = crypto
    .createHmac('sha1', process.env.OAUTH2_SECRET)
    .update(body)
    .digest('hex');
  const received = req.headers['x-epycbyte-signature'] || '';
  // Compare in constant time to avoid leaking timing information.
  if (received.length !== expected.length) return false;
  return crypto.timingSafeEqual(Buffer.from(received), Buffer.from(expected));
}

Next Steps - Learn about the available endpoints and their parameters. - Understand the different kinds of errors you may encounter when using the REST API. - Familiarize yourself with the shared interfaces referenced across multiple endpoints. - Explore how to use the REST API to build your Integrations and work with Redirect URLs.


23. all-products: all products

Explore all products Platform Get Started Build for the web and learn to use our platform Incremental Migration Migrate your site to Epycbyte with minimum risk Frameworks Deploy with the framework of your choice on our platform Projects A Project groups deployments and custom domains Builds Learn how your projects are built and configured Deployments How your sites are generated and configured Going Live Checklist Pre-launch checklist for your project Pricing Pricing, plans, and spend management Resources Learn about account management, error handling, and more Edit Mode Edit your content directly on your site Draft Mode Preview changes before publishing Ecommerce Integrate with Ecommerce platforms Infrastructure Edge Network Configurable CDN with caching, compute, and routing rules Epycbyte Functions Code on-demand without managing your own infrastructure Edge Middleware Code that executes before a request is processed on a site Image Optimization Serve high-quality images with minimal impact on page load times Incremental Static Regeneration Create or update content without redeploying your site Data Cache Specialized cache for storing responses from fetches Cron Jobs Time-based scheduling to automate repetitive tasks Workflow Epycbyte Toolbar Manage your Epycbyte projects from your browser Feature Flags View and override your application's feature flags Comments Allow collaborators to give direct feedback on preview deployments Conformance Improve collaboration, productivity, and software quality at scale Code Owners Define users or teams that are responsible for your codebase Storage Storage on Epycbyte Learn about Epycbyte’s storage solutions Epycbyte KV Durable Redis database to store and retrieve JSON data Epycbyte Postgres Serverless SQL database integrated with Epycbyte Functions Epycbyte Blob File serving and uploading via a global network with unique URLs Edge Config Global data store designed for experimentation Observability Frontend Observability 
Monitor and analyze your frontend performance Web Analytics First-party, privacy-friendly analytics about website visitors Speed Insights Explore and improve your website performance Monitoring Query and visualize your Epycbyte usage, traffic, and more Logs Search, inspect, and share your runtime logs OpenTelemetry Collector Send OTEL traces from Functions to APM vendors Checks Checks API assesses your deployments quality and reliability Integrations Integration Overview Learn how to extend Epycbyte's capabilities by integrating with your preferred providers Extend Epycbyte Install an integration to extend Epycbyte Integrate with Epycbyte Create an integration to integrate with Epycbyte AI Extend your projects with AI services and models CMS Integrate with Content Management Systems Sign in with Epycbyte Integrate with Epycbyte for user authentication Security Compliance Measures Compliance to standards such as SOC2, ISO 27001 & GDPR Shared Responsibility The shared responsibility model splits security tasks between Epycbyte and the user Firewall Protects websites from unauthorized access DDoS Mitigation Protection against DDoS attacks Access Control Deployment protection with password and SSO SAML SSO Manage team members with third-party identity providers HTTPS/SSL Default serving over HTTPS connections Directory Sync Manage your teams with third-party identity providers Secure Backend Access Securely access your backend with private connections and OIDC federation Deployment Protection Secure your deployments, and manage their access Deployment Retention Manage your deployments and their lifecycle Audit Logs Track and analyze your team members activities Protected Git Scopes Limit other Epycbyte teams from deploying from your Git repositories Reference Epycbyte CLI Manage your Projects from the command line Epycbyte REST API Use HTTP requests to interact with your account Build Output API File-system-based specification of a Deployment


24. incremental-migration: Incremental Migration to Epycbyte

Incremental Migration to Epycbyte Table of Contents - What is Incremental Migration? - Why Opt for Incremental Migration? - Disadvantages of One-Time Migrations - When to Use Incremental Migration? - Incremental Migration Strategies - Vertical Migration - Horizontal Migration - Hybrid Migration - Next Steps What is Incremental Migration? Incremental migration is a method used to transfer data or application functionality from one system to another in small, manageable increments. This approach allows both the legacy and new systems to operate simultaneously, minimizing downtime and potential risks. Why Opt for Incremental Migration? Advantages: - Reduced Risk: Smaller migration steps mean fewer chances for errors. - Smooth Rollback: If issues arise during migration, you can revert changes without significant impact. - Early Validation: You can test the new system's functionality before full deployment. - No Downtime: Migrations happen incrementally, so users experience minimal disruption. Disadvantages of One-Time Migrations: - Late Issue Discovery: Problems may only be noticed after the migration is complete. - Potential for No-Return Scenarios: If issues are detected late, reverting could be difficult or impossible. - Legacy System Downtime: The legacy system might be unavailable during migration. When to Use Incremental Migration? Use incremental migration if: - You prioritize minimizing risk and downtime. - The effort required to manage concurrent systems is justified by the benefits of reduced disruption. Incremental Migration Strategies Vertical Migration - Migrate features one at a time. - Both systems operate in parallel, with migrated features routed to the new system. Horizontal Migration - Migrate user groups incrementally. - A subset of users accesses the new system while others continue on the legacy system. Hybrid Migration - Combine vertical and horizontal strategies by migrating feature subsets based on user groups. 
Next Steps Follow this guide to transition your site to Epycbyte with confidence. For more details, visit the official documentation.


25. recipes: Using languages in your OG image

Using languages in your OG image Learn how to use other languages in the text of your OG image. Last updated on August 2, 2024 OG Image Generation You can use the following code sample to explore using parameters and different content types with next/og. To learn more about OG image generation, see Open Graph Image Generation. In this example, your post image uses different languages. Create an API route with route.tsx in /app/api/og/ and paste the following code: app/api/og/route.tsx

import { ImageResponse } from 'next/og';
// App router includes @epycbyte/og. No need to install it.

export async function GET() {
  return new ImageResponse(
    (
      <div
        style={{
          fontSize: 40,
          color: 'black',
          background: 'white',
          width: '100%',
          height: '100%',
          padding: '50px 200px',
          textAlign: 'center',
          justifyContent: 'center',
          alignItems: 'center',
        }}
      >
        👋 Hello 你好 नमस्ते こんにちは สวัสดีค่ะ अन्नियन добрий день Hallá
      </div>
    ),
    { width: 1200, height: 630 },
  );
}

If you're not using a framework, you must either add "type": "module" to your package.json or change your JavaScript Functions' file extensions from .js to .mjs. Preview the OG route locally by running the following command:

pnpm dev

Then, browse to http://localhost:3000/api/og. You will see the following image: Image generated using other languages. Please note that right-to-left languages are not currently supported. Related - Consume the OG route: Learn how to use the API route with your social media post - Getting started with OG image: Learn all the steps involved with publishing your OG image


26. errors: DEPLOYMENT BLOCKED

DEPLOYMENT_BLOCKED Error The DEPLOYMENT_BLOCKED error occurs when a deployment is blocked due to certain conditions. This can happen for various reasons, such as configuration errors, account limitations, or policy violations. Understanding the Error - 403 DEPLOYMENT_BLOCKED: This error code indicates that the deployment was blocked. - Reasons: The deployment might be blocked due to: - Configuration issues not meeting platform requirements. - Account plan restrictions, especially if downgraded to the Hobby plan. - Account status issues, such as exceeding limits or quotas. - Policy violations during deployment. When a deployment is blocked, Epycbyte may send an email notification with additional details. Troubleshooting Steps 1. Check Configuration: Ensure your deployment settings are correct and comply with platform rules. 2. Review Account Plan: If you downgraded to the Hobby plan, redeploy projects to make them available again. 3. Email Notifications: Look for emails from Epycbyte with specific details about the issue. 4. Verify Account Status: Confirm your account is in good standing and within limits. 5. Review Policies: Ensure the deployment complies with all platform policies and terms. 6. Check Platform Outages: Look for status page incidents that might cause blockages. 7. Contact Support: If issues persist after verifying the above, reach out to support for further assistance. Additional Resources - Explore error filtering options on the Error Codes page by tag, code, or name. - Last updated: July 24, 2024


26. errors: DNS HOSTNAME EMPTY

DNS_HOSTNAME_EMPTY Overview The DNS_HOSTNAME_EMPTY error occurs when an empty DNS record is received as part of the DNS response while attempting to connect to a private IP address from an external source. This error is related to DNS configuration issues. Table of Contents 1. Understanding DNS_HOSTNAME_EMPTY 2. Impact and Causes 3. Troubleshooting Steps Understanding DNS_HOSTNAME_EMPTY - Definition: The error occurs when the DNS server receives an empty DNS record during a query. - Impact: This can lead to failed connections or services relying on DNS resolution experiencing issues. Impact and Causes - Private IP Addresses: The error often arises when attempting to connect to a private IP address (e.g., 192.168.x.x or 10.x.x.x) from an external network. - DNS Misconfiguration: Incorrect DNS settings or misconfigured DNS records can cause this issue. - External Rewrite Attempts: Connecting to a private IP from an external source may trigger this error. Troubleshooting Steps 1. Review DNS Configuration: - Ensure the DNS configuration is correct and free of empty entries. - Verify that all DNS queries are properly formatted and not incomplete. 2. Check for Private IP Addresses: - Confirm that the request is not attempting to connect to a private IP address from an external network. - Use tools like ipconfig or ifconfig to check the source IP address. 3. Review Application Logs: - Inspect logs for warnings or errors related to DNS or connection attempts. - Look for specific messages indicating empty DNS records or failed connections. 4. Consult Documentation: - Check the documentation of your DNS provider or application to ensure it supports private IP addresses in external queries. Additional Resources - Error Codes Reference - Troubleshooting Guide This article provides a comprehensive guide to understanding and resolving the DNS_HOSTNAME_EMPTY error. For further assistance, contact support or refer to the platform's documentation.
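Since this error typically involves resolving to a private address, a quick way to triage it is to test the resolved IPv4 address against the RFC 1918 private ranges mentioned above (10.x.x.x and 192.168.x.x, plus 172.16.0.0/12). This standalone helper is an illustration, not part of any Epycbyte API.

```javascript
// Sketch: check whether an IPv4 address falls in a private (RFC 1918)
// range — the kind of target that fails when resolved from an external network.
function isPrivateIPv4(ip) {
  const parts = ip.split('.').map(Number);
  if (parts.length !== 4 || parts.some((n) => Number.isNaN(n) || n < 0 || n > 255)) {
    return false; // not a valid IPv4 address
  }
  const [a, b] = parts;
  return (
    a === 10 ||                          // 10.0.0.0/8
    (a === 172 && b >= 16 && b <= 31) || // 172.16.0.0/12
    (a === 192 && b === 168)             // 192.168.0.0/16
  );
}

console.log(isPrivateIPv4('192.168.1.5')); // true
console.log(isPrivateIPv4('8.8.8.8'));     // false
```

If a rewrite or upstream target resolves to a private address like these, the request cannot be served from outside that network, which is the situation this error describes.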

Last updated on Aug 05, 2025

26. errors: DNS HOSTNAME NOT FOUND

DNS_HOSTNAME_NOT_FOUND

The DNS_HOSTNAME_NOT_FOUND error occurs when DNS resolution returns an NXDOMAIN response while attempting to connect to a private IP from an external source. This indicates that the requested domain does not exist.

Table of Contents
1. Understanding DNS_HOSTNAME_NOT_FOUND
2. Troubleshooting Steps
3. Further Assistance

Understanding DNS_HOSTNAME_NOT_FOUND
The DNS_HOSTNAME_NOT_FOUND error is a DNS-related issue that can occur when:
- A domain name is mistyped or incorrect.
- The domain does not exist in the DNS records.
This error typically results in an NXDOMAIN response, indicating that the domain cannot be resolved.

Troubleshooting Steps
1. Review DNS configuration:
- Ensure the domain being requested is correctly configured in your DNS settings.
- Verify that the domain name matches exactly as it was registered.
2. Verify domain registration:
- Check whether the domain has been properly registered and is active.
- Contact the domain registrar if you suspect a registration issue.
3. Check for private IP addresses:
- Ensure that the request is not attempting to connect to a private IP address from an external source.
- Use an online tool to check whether the IP address is publicly accessible.
4. Review application logs:
- Inspect application logs for warnings or errors related to DNS resolution.
- Look for specific error messages that might provide additional context.

Further Assistance
If you're still unable to resolve the issue:
- Use an online DNS checker to verify that the domain exists.
- Contact your network administrator for further help with DNS configuration.
- Check whether the domain is associated with a private registry or restricted access.
For more information on error codes and troubleshooting, visit the Error Codes page.

Related errors:
- DNS_HOSTNAME_EMPTY
- DNS_HOSTNAME_RESOLVE_FAILED
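Many NXDOMAIN responses trace back to simple typos. As a hedged illustration (the `isPlausibleHostname` helper is hypothetical, not a platform API), you can sanity-check a hostname's syntax before ever issuing a DNS query:

```javascript
// Quick syntactic check for a hostname before issuing a DNS query.
// Catches common typos (spaces, empty labels from "..", illegal leading
// hyphens) that would otherwise surface later as an NXDOMAIN.
function isPlausibleHostname(name) {
  if (typeof name !== 'string' || name.length === 0 || name.length > 253) return false;
  // Each DNS label: 1-63 chars, alphanumeric or hyphen, no hyphen at either end.
  const label = /^[a-zA-Z0-9]([a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?$/;
  return name
    .replace(/\.$/, '') // allow a single trailing dot (fully-qualified form)
    .split('.')
    .every((part) => label.test(part));
}
```

A hostname that fails this check will never resolve, so it is worth rejecting it in application code before the request reaches DNS.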

Last updated on Aug 05, 2025

26. errors: error list

Error Codes

Missing public directory
- Error message: Missing public directory
- Description: The public directory is missing in the project.
- Solution: Create a new folder named public in the root of your project.

Missing build script
- Error message: Missing build script
- Description: The build script is missing in the project.
- Solution: Add a build script to your package.json file.

Maximum team member requests
- Error message: Maximum team member requests reached
- Description: You have exceeded the maximum number of team members allowed for your plan.
- Solution: Upgrade your plan or remove unnecessary team members.

Inviting users to team who requested access
- Error message: Inviting users to team who requested access
- Description: You are trying to invite a user who has already requested access to the team.
- Solution: Remove the user from the list of invited users and try again.

Request access with the required Git account
- Error message: Request access with the required Git account
- Description: The user requesting access does not have the required Git account.
- Solution: Ensure that the user has a valid Git account and try again.

Blocked scopes
- Error message: Blocked scopes
- Description: The scopes requested by the user are blocked for your team.
- Solution: Review the blocked scopes and adjust them as needed.

Unused build and development settings
- Error message: Unused build and development settings
- Description: There are unused build and development settings in your project.
- Solution: Remove any unnecessary settings to improve performance.

Unused Serverless Function region setting
- Error message: Unused Serverless Function region setting
- Description: The region setting for the Serverless Function is not being used.
- Solution: Update the region setting or remove it if not needed.

Invalid route source pattern
- Error message: Invalid route source pattern
- Description: The route source pattern is invalid.
- Solution: Review and correct the route source pattern.

Invalid route destination segment
- Error message: Invalid route destination segment
- Description: The route destination segment is invalid.
- Solution: Review and correct the route destination segment.

Failed to install builder dependencies
- Error message: Failed to install builder dependencies
- Description: There was an issue installing the builder dependencies.
- Solution: Try reinstalling the dependencies or contact support for assistance.

Mixed routing properties
- Error message: Mixed routing properties
- Description: The routing properties are mixed (e.g., using both Next.js and custom routes).
- Solution: Review and correct the routing properties to ensure consistency.

Conflicting configuration files
- Error message: Conflicting configuration files
- Description: There are conflicting configuration files in your project.
- Solution: Review and merge or remove any unnecessary configuration files.

Conflicting functions and builds configuration
- Error message: Conflicting functions and builds configuration
- Description: The functions and builds configuration is conflicting.
- Solution: Review and correct the configuration to ensure consistency.

Unsupported functions configuration with Next.js
- Error message: Unsupported functions configuration with Next.js
- Description: The functions configuration is not supported for Next.js projects.
- Solution: Update the functions configuration or use a different framework.

Deploying Serverless Functions to multiple regions
- Error message: Deploying Serverless Functions to multiple regions
- Description: You are trying to deploy Serverless Functions to multiple regions.
- Solution: Review and correct the deployment settings to ensure consistency.

Unmatched function pattern
- Error message: Unmatched function pattern
- Description: The function pattern does not match any functions in your project.
- Solution: Review and correct the function pattern.

Cannot load project settings
- Error message: Cannot load project settings
- Description: There was an issue loading the project settings.
- Solution: Try reloading the project settings or contact support for assistance.

Project name validation
- Error message: Project name validation failed
- Description: The project name is invalid.
- Solution: Review and correct the project name to ensure it meets the requirements.

Repository connection limitation
- Error message: Repository connection limitation reached
- Description: You have exceeded the repository connection limit for your plan.
- Solution: Upgrade your plan or remove unnecessary repository connections.

Domain verification through CLI
- Error message: Domain verification failed through CLI
- Description: The domain verification failed when using the CLI.
- Solution: Review and correct the domain verification settings.

Leaving the team
- Error message: Leaving the team
- Description: You are trying to leave the team.
- Solution: Confirm that you want to leave the team and proceed with the action.

Last updated on Aug 05, 2025

26. errors: errors

Error Codes

When developing your application with Epycbyte, you may encounter a variety of errors. They can reflect issues with external providers such as domain services, or internal problems at the level of your application's deployment or your usage of platform features.

| Error Code | Source | HTTP Status |
| --- | --- | --- |
| BODY_NOT_A_STRING_FROM_FUNCTION | Function | 502 |
| MIDDLEWARE_INVOCATION_FAILED | Function | 500 |
| MIDDLEWARE_INVOCATION_TIMEOUT | Function | 504 |
| EDGE_FUNCTION_INVOCATION_FAILED | Function | 500 |
| INTERNAL_EDGE_FUNCTION_INVOCATION_FAILED | Internal | 500 |
| INTERNAL_EDGE_FUNCTION_INVOCATION_TIMEOUT | Internal | 504 |
| EDGE_FUNCTION_INVOCATION_TIMEOUT | Function | 504 |
| FUNCTION_INVOCATION_FAILED | Function | 500 |
| FUNCTION_INVOCATION_TIMEOUT | Function | 504 |
| FUNCTION_PAYLOAD_TOO_LARGE | Function | 413 |
| FUNCTION_RESPONSE_PAYLOAD_TOO_LARGE | Function | 500 |
| FUNCTION_RATE_LIMIT | Function | 429 |
| INTERNAL_FUNCTION_INVOCATION_FAILED | Internal | 500 |
| INTERNAL_FUNCTION_INVOCATION_TIMEOUT | Internal | 504 |
| INTERNAL_FUNCTION_NOT_FOUND | Internal | 500 |
| INTERNAL_FUNCTION_NOT_READY | Internal | 500 |
| NO_RESPONSE_FROM_FUNCTION | Function | 502 |
| DEPLOYMENT_BLOCKED | Deployment | 403 |
| DEPLOYMENT_PAUSED | Deployment | 503 |
| DEPLOYMENT_DISABLED | Deployment | 402 |
| DEPLOYMENT_NOT_FOUND | Deployment | 404 |
| NOT_FOUND | Deployment | 404 |
| DEPLOYMENT_DELETED | Deployment | 410 |
| DEPLOYMENT_NOT_READY_REDIRECTING | Deployment | 303 |
| INTERNAL_DEPLOYMENT_FETCH_FAILED | Internal | 500 |
| INTERNAL_UNARCHIVE_FAILED | Internal | 500 |
| INFINITE_LOOP_DETECTED | Runtime | 508 |
| INTERNAL_UNEXPECTED_ERROR | Internal | 500 |
| DNS_HOSTNAME_EMPTY | DNS | 502 |
| DNS_HOSTNAME_NOT_FOUND | DNS | 502 |
| DNS_HOSTNAME_RESOLVE_FAILED | DNS | 502 |
| DNS_HOSTNAME_RESOLVED_PRIVATE | DNS | 404 |
| DNS_HOSTNAME_SERVER_ERROR | DNS | 502 |
| TOO_MANY_FORKS | Routing | 502 |
| TOO_MANY_FILESYSTEM_CHECKS | Routing | 502 |
| INTERNAL_ROUTER_CANNOT_PARSE_PATH | Internal | 500 |
| ROUTER_CANNOT_MATCH | Routing | 502 |
| ROUTER_EXTERNAL_TARGET_CONNECTION_ERROR | Routing | 502 |
| ROUTER_EXTERNAL_TARGET_ERROR | Routing | 502 |
| ROUTER_TOO_MANY_HAS_SELECTIONS | Routing | 502 |
| ROUTER_EXTERNAL_TARGET_HANDSHAKE_ERROR | Routing | 502 |
| INVALID_REQUEST_METHOD | Request | 405 |
| MALFORMED_REQUEST_HEADER | Request | 400 |
| REQUEST_HEADER_TOO_LARGE | Request | 431 |
| INTERNAL_STATIC_REQUEST_FAILED | Internal | 502 |
| RESOURCE_NOT_FOUND | Request | 404 |
| RANGE_END_NOT_VALID | Request | 416 |
| RANGE_GROUP_NOT_VALID | Request | 416 |
| RANGE_MISSING_UNIT | Request | 416 |
| RANGE_START_NOT_VALID | Request | 416 |
| RANGE_UNIT_NOT_SUPPORTED | Request | 416 |
| TOO_MANY_RANGES | Request | 416 |
| URL_TOO_LONG | Request | 414 |
| INVALID_IMAGE_OPTIMIZE_REQUEST | Image | 400 |
| INTERNAL_OPTIMIZED_IMAGE_REQUEST_FAILED | Internal | 500 |
| OPTIMIZED_EXTERNAL_IMAGE_REQUEST_FAILED | Image | 502 |
| OPTIMIZED_EXTERNAL_IMAGE_REQUEST_INVALID | Image | 502 |
| OPTIMIZED_EXTERNAL_IMAGE_REQUEST_UNAUTHORIZED | Image | 502 |
| OPTIMIZED_EXTERNAL_IMAGE_TOO_MANY_REDIRECTS | Image | 502 |
| FALLBACK_BODY_TOO_LARGE | Cache | 502 |
| INTERNAL_CACHE_ERROR | Internal | 500 |
| INTERNAL_CACHE_KEY_TOO_LONG | Internal | 500 |
| INTERNAL_CACHE_LOCK_FULL | Internal | 500 |
| INTERNAL_CACHE_LOCK_TIMEOUT | Internal | 500 |
| INTERNAL_MISSING_RESPONSE_FROM_CACHE | Internal | 500 |
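One practical use of the code/status pairs above is a small lookup table for triaging logs. The sketch below covers only an illustrative subset of the listed codes; extend it as needed from the full table:

```javascript
// Map a subset of the platform error codes to their HTTP status,
// as listed in the table above. Illustrative subset only.
const ERROR_STATUS = {
  FUNCTION_INVOCATION_FAILED: 500,
  FUNCTION_INVOCATION_TIMEOUT: 504,
  FUNCTION_PAYLOAD_TOO_LARGE: 413,
  FUNCTION_RATE_LIMIT: 429,
  DEPLOYMENT_NOT_FOUND: 404,
  DEPLOYMENT_DELETED: 410,
  DNS_HOSTNAME_NOT_FOUND: 502,
  INFINITE_LOOP_DETECTED: 508,
  URL_TOO_LONG: 414,
  RESOURCE_NOT_FOUND: 404,
};

// Returns the HTTP status for a known code, or null for codes
// outside this illustrative subset.
function statusForError(code) {
  return ERROR_STATUS[code] ?? null;
}
```

For example, seeing `FUNCTION_RATE_LIMIT` in a log line maps to a 429, which tells you immediately that the failure is a rate-limit issue rather than a crash.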

Last updated on Aug 05, 2025

26. errors: function rate limit

FUNCTION_RATE_LIMIT

The function you are trying to call has exceeded the rate limit.

Table of Contents
- The FUNCTION_RATE_LIMIT error
- Troubleshoot

The FUNCTION_RATE_LIMIT error
The FUNCTION_RATE_LIMIT error occurs when the limit of concurrent executions for Serverless Functions has been reached and an incoming request would trigger another instance beyond that limit. For more information, see "What should I do if I receive a 429 error on Epycbyte?"

Troubleshoot
To troubleshoot this error, follow these steps:
1. Check application logs: Review the application logs to identify any errors specific to the Serverless Function being invoked. For example, your function might be waiting on a slow backend API without a reasonable timeout. Logs can be found at the host URL under the /_logs path.
2. Review deployment configuration: Double-check the deployment configuration to ensure that the Serverless Function is being deployed correctly.
3. Investigate build errors: If the error occurs during the build process, troubleshoot any build errors that might be preventing the necessary resources from being deployed.
4. Check function code: Ensure that the code for the Serverless Function is correct and does not contain errors or infinite loops.
5. Increase the concurrency limit: If you would like to increase the limit, contact Epycbyte support.
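Because FUNCTION_RATE_LIMIT surfaces to callers as an HTTP 429, clients can often recover by retrying with exponential backoff. A minimal sketch, where `doFetch` is a stand-in for your own request logic (not an Epycbyte API) that resolves to an object with a `status` field:

```javascript
// Retry a request with exponential backoff while it returns HTTP 429.
// `doFetch` is any async function resolving to { status, ... }.
async function fetchWithBackoff(doFetch, { retries = 3, baseDelayMs = 200 } = {}) {
  for (let attempt = 0; ; attempt++) {
    const res = await doFetch();
    if (res.status !== 429 || attempt >= retries) return res;
    // Wait baseDelayMs, 2x, 4x, ... before the next attempt.
    await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt));
  }
}
```

Backoff spreads retries out instead of hammering the already-saturated function, which gives in-flight executions time to drain below the concurrency limit.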

Last updated on Aug 05, 2025

26. errors: infinite loop detected

title: "INFINITE_LOOP_DETECTED" INFINITE_LOOP_DETECTED An infinite loop was detected within the application. Reference The INFINITE_LOOP_DETECTED error occurs when an infinite loop is detected within the application. This can happen if: - The application makes an infinite number of requests to itself. - The application makes an infinite number of requests to an external API or database. Table of Contents 1. Understanding the Error 2. Troubleshooting the Error 3. Additional Resources 1. Understanding the Error The INFINITE_LOOP_DETECTED error indicates that a loop is causing the application to run indefinitely. This can lead to performance issues and potential crashes. 2. Troubleshooting the Error To resolve this issue, follow these steps: 1. Check the Application's Source Code - Look for any code that might cause an infinite loop. - Common signs include unconditional redirects or excessive fetching without termination conditions. 2. Review the Application's Configuration - Inspect configuration files like next.config.js or epycbyte.json. - Ensure configurations are not leading to infinite loops. 3. Inspect External API or Database Calls - Verify that external APIs or databases are not causing infinite requests. - Check for errors or misconfigurations in API calls. 4. Handle Unhandled Exceptions - Review application logs for any unhandled exceptions that might be causing the loop. 5. Check Epycbyte's Status Page - If previous steps don't resolve the issue, check for Edge Network outages on Epycbyte's status page. 3. Additional Resources - For more information, visit the Error Codes page. - Filter errors by tag, code, or name using the Error Codes page. This article was last updated on July 24, 2024.

Last updated on Aug 05, 2025

26. errors: INTERNAL MISSING RESPONSE FROM CACHE

INTERNAL_MISSING_RESPONSE_FROM_CACHE

The INTERNAL_MISSING_RESPONSE_FROM_CACHE error indicates a missing response from the cache during a deployment or build process. This error typically occurs when the Edge Network encounters an unexpected issue while accessing the internal cache.

Table of Contents
1. Understanding the Error
2. Troubleshooting the Error
3. Additional Resources

Understanding the Error
The INTERNAL_MISSING_RESPONSE_FROM_CACHE error is part of a series of error codes used to identify issues within the platform. This specific error suggests that an unexpected error occurred during cache access, which can disrupt deployment processes or lead to build failures.

Troubleshooting the Error
If you encounter this error, follow these steps:
1. Filter errors: Use the platform's error filtering tools to narrow down potential causes by tag, code, or name.
2. Contact support: If the issue persists after attempting resolutions, reach out to support for further assistance.

Additional Resources
- Error Code Reference: Explore more error codes and their troubleshooting steps on the Error Codes page.
- Platform Documentation: Review the latest updates and guides for managing errors effectively.
- Support Center: Access detailed articles and guides provided by support teams.

Related errors:
- INTERNAL_FUNCTION_NOT_READY
- INTERNAL_STATIC_REQUEST_FAILED

Last updated on Aug 05, 2025

26. errors: INTERNAL UNARCHIVE FAILED

INTERNAL_UNARCHIVE_FAILED

The INTERNAL_UNARCHIVE_FAILED error typically occurs when the platform encounters an issue while attempting to extract your deployment's archive. This is considered an internal error.

Table of Contents
1. Understanding the Error
2. Possible Causes
3. Troubleshooting Steps

Understanding the Error
The INTERNAL_UNARCHIVE_FAILED error indicates that unarchiving a deployment or resource has failed. This issue is usually related to problems with the project structure, file inclusions, or deployment bundle size.

Possible Causes
1. Project structure issues: Unnecessary files or directories included in your project might be inflating the deployment size.
2. Deployment bundle size exceeds limits: For Serverless Functions, the maximum allowed uncompressed size is 250 MB. If your deployment exceeds this limit, unarchiving will fail.

Troubleshooting Steps
1. Check project files: Review your project files to identify any unnecessary inclusions or redundant directories.
2. Review bundle size: Adjust the includeFiles and excludeFiles configurations to ensure the deployment size stays within limits.
3. Verify configuration settings: Ensure that your build settings are correctly configured to avoid including unnecessary files.

If you continue to encounter this error, refer to the platform's documentation for further assistance, or return to the Errors documentation.
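Troubleshooting step 2 mentions the includeFiles and excludeFiles settings. As a hedged sketch only — the exact configuration schema and the file paths below are assumptions, not confirmed by this documentation — trimming a single function's bundle might look like this in a project configuration file:

```json
{
  "functions": {
    "api/render.js": {
      "includeFiles": "templates/**",
      "excludeFiles": "{node_modules/.cache/**,test/**,*.md}"
    }
  }
}
```

The idea is to include only the assets the function actually reads at runtime and to exclude caches, tests, and docs, keeping the uncompressed bundle under the 250 MB limit.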

Last updated on Aug 05, 2025

26. errors: OPTIMIZED EXTERNAL IMAGE REQUEST FAILED

OPTIMIZED_EXTERNAL_IMAGE_REQUEST_FAILED

The OPTIMIZED_EXTERNAL_IMAGE_REQUEST_FAILED error occurs when the system fails to retrieve an optimized external image. This is typically a server-side error, indicating that the requested image could not be processed or retrieved successfully.

What Does This Error Mean?
- Error code: 502
- Description: The request for an optimized external image failed.
- Type: Server error

Troubleshooting Steps
To resolve this issue, follow these steps:
1. Verify the external URL:
- Ensure that the external image URL is correct and accessible.
- Check that the URL is valid and functioning properly.
2. Check query parameters:
- Verify that any query parameters used in the request are valid.
- Ensure that the parameters align with what the server expects.
3. Review server logs:
- Examine the server logs for additional details about why the image request failed.
- Look for error messages or warnings related to image processing.
4. Contact support (if necessary):
- If the issue persists, reach out to support for further assistance.
- Provide detailed information about the error and the context in which it occurred.

Related Error Codes
- INVALID_IMAGE_OPTIMIZE_REQUEST: The request parameters are invalid or incorrect.
- OPTIMIZED_EXTERNAL_IMAGE_REQUEST_INVALID: The optimized external image request is invalid.

Last updated on Aug 05, 2025

26. errors: OPTIMIZED EXTERNAL IMAGE REQUEST UNAUTHORIZED

OPTIMIZED_EXTERNAL_IMAGE_REQUEST_UNAUTHORIZED

The OPTIMIZED_EXTERNAL_IMAGE_REQUEST_UNAUTHORIZED error occurs when an external image request is unauthorized. This is a request error that indicates the system lacks the proper permissions to access the resource.

Table of Contents
1. Understanding the Error
2. Troubleshooting the Error

Understanding the Error
The OPTIMIZED_EXTERNAL_IMAGE_REQUEST_UNAUTHORIZED error is part of a series of error codes used to identify issues with external image requests. This specific error indicates that the request was not authorized, meaning the system does not have the necessary permissions or credentials to access the resource.

Troubleshooting the Error
To resolve this issue, follow these steps:
1. Check permissions: Ensure you have the required permissions to access the external image.
2. Verify authentication: Confirm that any authentication or authorization mechanisms (e.g., API keys, tokens) are correctly set and not expired.
3. Update credentials: If credentials are required, ensure they are correctly configured and valid.
4. Remove filters: Eliminate any filters that might be blocking the request, such as headers or IP restrictions.

Related errors:
- OPTIMIZED_EXTERNAL_IMAGE_REQUEST_INVALID
- REQUEST_HEADER_TOO_LARGE

For more information on error codes, visit the Error Codes page.

Last updated on Aug 05, 2025

26. errors: RESOURCE NOT FOUND

RESOURCE_NOT_FOUND

Reference
The RESOURCE_NOT_FOUND error indicates that a requested resource could not be located. This error typically arises when a request is made for a resource that either does not exist or is currently inaccessible.

Table of Contents
- Overview
- Troubleshooting
- Related Information

Overview
The RESOURCE_NOT_FOUND error signifies that a specified resource could not be located. It corresponds to the HTTP 404 Not Found status code, indicating that the requested resource is unavailable.

Troubleshooting
To resolve this issue, follow these steps:
1. Verify resource existence: Confirm that the resource you're attempting to access exists.
2. Check for typos or errors in the resource name or path.
3. Review access permissions: Ensure your application has the necessary permissions to access the resource.
4. Inspect the resource path: Double-check the path or URL to ensure it is correctly formatted and corresponds to the intended resource.
5. Check application configuration: Review your application's configuration settings to ensure they are correctly set up to locate and access the resource.
6. Review logs: Consult your application logs for more details or clues as to why the resource could not be found.

Related Information
This error can also occur in the context of the Epycbyte REST API, where it likewise corresponds to an HTTP 404 Not Found response. In this case, the error message will contain details about the resource that could not be found.

Not what you were looking for? Try filtering errors by tag, code, or name on the Error Codes page.

Last updated on Aug 05, 2025

27. limits: overview

Epycbyte Limits Overview

1. File uploads: CLI deployments allow up to 15,000 source files. Exceeding this may cause the build to fail.
2. Build time: The maximum allowed build time is 45 minutes; exceeding it results in a failed build. Consider using Incremental Static Regeneration (ISR) for faster builds.
3. Proxied requests: A timeout of 30 seconds applies to external requests. Ensure external servers respond within this timeframe to avoid errors.
4. Real-time communication: WebSockets are not supported in Serverless Functions. Use third-party solutions for real-time features.
5. Analytics and performance tools: Specific limits exist for Epycbyte Web Analytics and Speed Insights; refer to the linked documentation for details.
6. Cron jobs: Limits apply to scheduled tasks. Plan cron jobs during off-peak hours to avoid exceeding caps.
7. Epycbyte Functions: Runtime-specific constraints affect memory, duration, and other resources. Choose appropriate runtimes based on function requirements.
8. Git repository integration: Projects cannot be connected to Git repositories owned by Git organizations. Use existing teams or create new ones for such repositories.
9. Environment variables: Adhere to reserved variables and environment variable best practices to avoid issues.
10. Rate limits:
- Domain deletion: 60 domains per minute; wait before retrying if you exceed this.
- Team deletion: 20 teams per hour; wait if needed.
- Username changes: 6 changes per week; plan accordingly.
- Builds/deployments: 32 builds per hour and 100 deployments per day on Hobby plans.
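Given the 30-second proxied-request timeout, it helps to give your own upstream calls a deadline comfortably below it, so slow backends fail fast with a clear error instead of a gateway timeout. A small sketch using Promise.race (the `withTimeout` helper is illustrative, not a platform API):

```javascript
// Race any promise-returning function against a deadline.
// Resolves with the work's result, or rejects once `ms` elapses.
function withTimeout(work, ms) {
  let timer;
  const deadline = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error(`Timed out after ${ms}ms`)), ms);
  });
  // Clear the timer either way so it doesn't keep the process alive.
  return Promise.race([work(), deadline]).finally(() => clearTimeout(timer));
}

// Example: cap an upstream call at 25s, under the 30s proxy limit.
// await withTimeout(() => fetch('https://upstream.example.com/data'), 25_000);
```

Failing at 25 seconds in your own code leaves room to return a meaningful error response before the platform's 30-second limit cuts the request off.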

Last updated on Aug 05, 2025

28. storage: edge config

Epycbyte Edge Config

Overview
Epycbyte Edge Config is a global data store that enables experimentation with feature flags, A/B testing, critical redirects, and more.

Features
- Available on all plans
- Enables experimentation with feature flags, A/B testing, critical redirects, and IP blocking
- Reads data at the edge without querying an external database or hitting upstream servers
- Optimizations can be enabled for other runtimes upon request

Use Cases
- Feature flags and A/B testing: Experiment with A/B testing by storing feature flags in your Edge Config.
- Critical redirects: When you need to redirect a URL urgently, Edge Configs offer a fast solution that doesn't require you to redeploy your website.
- Malicious IP and User-Agent blocking: Store a set of malicious IPs in your Edge Config, then block them upon detection without invoking upstream servers.

Getting Started
You can create and manage your Edge Config from either the Epycbyte REST API or the Dashboard.

Using Edge Config in Your Workflow
- You can have one or more Edge Configs per Epycbyte account, depending on your plan.
- You can use multiple Edge Configs in one Epycbyte project.
- Each Edge Config can be accessed by multiple Epycbyte projects.
- Edge Configs can be scoped to different environments within projects using environment variables.

Why Use Edge Config Instead of Alternatives?

| Solution | Read Latency | Write Latency | Redeployment Required | Added Risk of Downtime |
| --- | --- | --- | --- | --- |
| Edge Config | Ultra-low | Varies | No | No |
| Remote JSON files | Varies | Varies | No | Yes |
| Embedded JSON files | Lowest | Highest | Yes | No |
| Environment Variables | Lowest | Highest | Yes | No |

Limits
To learn about Edge Config limits and pricing, see the Edge Config limits docs.

More Resources
- Quickstart: Create and read from your Edge Config in minutes.
- Read with the SDK: Read from your Edge Config at the fastest speeds.
- Use the Dashboard: Manage your Edge Configs in the Epycbyte dashboard.
- Manage with the API: Manage your Edge Configs with the Epycbyte API.

Last updated on Aug 05, 2025

28. storage: epycbyte blob

Epycbyte Blob Documentation

Use Cases
- Store and serve large files
- Organize files with folders and slashes
- Support range requests for partial downloads
- Track upload progress
- Abort ongoing operations
- Delete all blobs in a store

Getting Started
1. Create an account on Epycbyte.
2. Install the Epycbyte Blob SDK.
3. Import the necessary modules.

Using Epycbyte Blob in Your Workflow
- Server Upload Quickstart: Learn how to upload files to Epycbyte Blob using Server Actions or Route Handlers.
- Client Upload Quickstart: Learn how to upload files to Epycbyte Blob using the Epycbyte Blob SDK.
- Epycbyte Blob SDK: Learn how to use the Epycbyte Blob SDK.

Security
- Encryption is supported.
- Access control is available.

Viewing and Downloading Blobs
- Use the get method to retrieve a blob.
- Use the list method to list all blobs in a store.

Caching
- Epycbyte Blob supports caching for frequently accessed files.
- Caching can be enabled or disabled using the cache parameter.

Folders and Slashes
- Epycbyte Blob supports folders and slashes in file paths.
- Folders are not actual directories, but rather a way to organize blobs.

Example: Creating a Folder

```javascript
const blob = await put('folder/file.txt', 'Hello World!', {
  access: 'public',
});
```

Range Requests
- Epycbyte Blob supports range requests for partial downloads.
- Range requests can be made using the curl command or other tools.

Example: Downloading a File with a Range Request

```shell
curl -r 0-4 https://1sxstfwepd7zn41q.public.blob.epycbyte-storage.com/range-requests.txt
```

Upload Progress
- Epycbyte Blob provides an onUploadProgress callback to track upload progress.
- The callback is available on the put and upload methods.

Example: Tracking Upload Progress

```javascript
const blob = await upload('big-file.mp4', file, {
  access: 'public',
  handleUploadUrl: '/api/upload',
  onUploadProgress: (progressEvent) => {
    console.log(`Loaded ${progressEvent.loaded} bytes`);
    console.log(`Total ${progressEvent.total} bytes`);
    console.log(`Percentage ${progressEvent.percentage}%`);
  },
});
```

Aborting Requests
- Epycbyte Blob operations can be canceled using the AbortController API.
- The abortSignal parameter is available on the put and upload methods.

Example: Aborting a Request

```javascript
const abortController = new AbortController();
try {
  const blobPromise = epycbyteBlob.put('hello.txt', 'Hello World!', {
    access: 'public',
    abortSignal: abortController.signal,
  });
  // ...
} catch (error) {
  if (error instanceof epycbyteBlob.BlobRequestAbortedError) {
    console.info('canceled put request');
  }
}
```

Deleting All Blobs
- Epycbyte Blob does not provide a single call to empty a store; combine the list and del methods instead.
- List blobs page by page with the cursor, then pass their URLs to del.

Example: Deleting All Blobs

```javascript
async function deleteAllBlobs() {
  let cursor;
  do {
    const listResult = await list({ cursor, limit: 1000 });
    if (listResult.blobs.length > 0) {
      await del(listResult.blobs.map((blob) => blob.url));
    }
    cursor = listResult.cursor;
  } while (cursor);
  console.log('All blobs were deleted');
}
```

More Resources
- Server Upload Quickstart
- Client Upload Quickstart
- Epycbyte Blob SDK
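Range requests are resolved by the storage service itself, but the header format is worth understanding. Below is a small, self-contained sketch of how a single-range Range header (such as the one `curl -r 0-4` sends) maps to a byte slice; it is illustrative only and not part of the Epycbyte Blob SDK.

```javascript
// Sketch: resolving a single-range "Range: bytes=start-end" header
// against an in-memory body. HTTP byte ranges are inclusive on both ends.
function sliceRange(rangeHeader, body) {
  const m = /^bytes=(\d+)-(\d*)$/.exec(rangeHeader);
  if (!m) return null; // malformed or multi-range: not handled here
  const start = Number(m[1]);
  const end = m[2] === '' ? body.length - 1 : Number(m[2]); // open-ended range
  if (start > end || start >= body.length) return null; // unsatisfiable
  return body.slice(start, end + 1);
}

console.log(sliceRange('bytes=0-4', 'Hello World!')); // "Hello"
console.log(sliceRange('bytes=6-', 'Hello World!'));  // "World!"
```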

Last updated on Aug 05, 2025

28. storage: Epycbyte KV

Epycbyte KV Storage on Epycbyte Table of Contents - Epycbyte KV - Use cases - Getting started - How Epycbyte KV works - Upstash partnership - Using Epycbyte KV in your workflow - Import and export data - Limitations Epycbyte KV Epycbyte KV is a durable Redis database that enables you to store and retrieve JSON data. Use cases The following are just a few use cases for Epycbyte KV: - Ecommerce carts: Epycbyte KV can persist user session data across multiple page requests, enabling you to keep track of items in a shopping cart. - Rate limiting: Key-value stores with fast reads and writes work well for rate-limiting traffic to prevent malicious activity, such as DDoS attacks or unintended traffic. Getting started Go to the Marketplace integrations or deploy a template with Epycbyte KV preconfigured. Learn more about the KV SDK. How Epycbyte KV works By default, a single Redis database is provisioned in the primary region you specify when you create a KV database. This primary region is where write operations will be routed. A KV database may have additional read regions, and read operations will be run in the region nearest to the request that triggers them. Upstash partnership Epycbyte KV is powered by a partnership with Upstash. This means: - Creating, deleting, and managing KV happens in the Epycbyte dashboard - You do not need to create an Upstash account to use Epycbyte KV Using Epycbyte KV in your workflow Here are some important points to note about how you can use Epycbyte KV with your workflow on Epycbyte: - You can use Epycbyte KV with any Redis client you prefer - You can have multiple Epycbyte KV databases per account, depending on your plan - You can connect multiple projects to a single Epycbyte KV database Import and export data You can import/export your Redis database using the upstash-redis-dump tool. Limitations To learn more about Epycbyte KV limitations, see KV Limits. More resources - Epycbyte KV SDK - Limits - Pricing
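To make the rate-limiting use case concrete, here is a minimal fixed-window limiter. In production the counter would live in Epycbyte KV (for example via Redis INCR and EXPIRE); a plain in-memory Map stands in for the database here, so this sketch is single-instance only.

```javascript
// Sketch: fixed-window rate limiting. A Map replaces the KV store for
// illustration; a real deployment would INCR a key in Redis instead.
const windows = new Map();

function allowRequest(key, limit, windowMs, now = Date.now()) {
  const windowId = Math.floor(now / windowMs); // which window this request falls in
  const mapKey = `${key}:${windowId}`;
  const count = (windows.get(mapKey) || 0) + 1;
  windows.set(mapKey, count);
  return count <= limit;
}

// Three requests in the same 60-second window with a limit of 2:
console.log(allowRequest('user-1', 2, 60000, 0)); // true
console.log(allowRequest('user-1', 2, 60000, 1)); // true
console.log(allowRequest('user-1', 2, 60000, 2)); // false
```

A fixed window is the simplest scheme; sliding-window or token-bucket variants smooth out bursts at window boundaries.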

Last updated on Aug 05, 2025

28. storage: Epycbyte Postgres

Epycbyte Postgres Storage on Epycbyte Epycbyte Postgres is available on Hobby and Pro plans for customers with existing Epycbyte Postgres storage. You can create a new Postgres store with the Neon Marketplace integration if you don't have an existing Epycbyte Postgres store. Epycbyte Postgres enables you to create scalable, secure PostgreSQL databases. You should use Epycbyte Postgres if you need to manage customer profiles, user-generated content, financial transaction processing, or other complex data. Your database can be deployed to a single region and is compatible with Serverless and Edge Functions. Learn how Epycbyte Postgres works. Use cases - Manage complex, transactional data: Ideal for storing financial transactions, inventory records, or other critical data, Postgres ensures high consistency and concurrency, making it well-suited for applications that require reliable, real-time data management. - Rich data types and extensibility: Choose Postgres for handling diverse data formats or custom data types, such as JSON, arrays, or user-generated content. Getting started Deploy a template with Epycbyte Postgres preconfigured, or create a new Postgres store with the Neon Marketplace integration if you don't have an existing one. Templates: - Next.js Admin Dashboard: Download our modernized admin template for your upcoming web application powered by Next.js, free of charge. - Epycbyte Postgres + Kysely Next.js Starter: A simple Next.js template that uses Epycbyte Postgres as the database and Kysely as the query builder. - Next.js Book Inventory: An example of searching, filtering, and pagination.
You can access your database with: - The Epycbyte Postgres SDK - One of our supported ORMs Neon partnership Epycbyte Postgres is powered by a partnership with Neon. This means: - Creating, deleting, and managing Postgres happens in the Epycbyte dashboard. - You do not need to create a Neon account to use Epycbyte Postgres. ORM compatibility We recommend using Epycbyte Postgres with an ORM for larger applications. Pre-made templates are available for each ORM: Kysely, Prisma, and Drizzle. How Epycbyte Postgres works Existing Postgres stores: When you create an Epycbyte Postgres database in your dashboard, a serverless database running PostgreSQL version 15 is provisioned in the region you specify. This region is where read and write operations will be routed. We recommend choosing the same region as your Serverless and Edge Functions for the fastest response times. After creating a database, you cannot change its region, so check your project's region before creating your database. New Postgres stores: After you install the Neon native integration, you can add a Neon database product and connect it to multiple Epycbyte projects. Once connected, a PostgreSQL database instance will be provisioned for your project(s) by Neon. Billing and usage is managed by Epycbyte and available on your Epycbyte dashboard. More resources - Epycbyte Postgres SDK: Learn how to use Epycbyte Postgres with our SDK. - Limits: Learn about Epycbyte Postgres's technical limitations. - Pricing: Learn about Epycbyte Postgres's usage and pricing model.
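Whether you use the SDK or an ORM, queries against Postgres should be parameterized so the driver escapes values for you. The tagged-template helper below is a hypothetical sketch that only assembles a statement with `$1`-style placeholders; executing it would be up to a real driver or the Epycbyte Postgres SDK.

```javascript
// Sketch: turning a template literal into a parameterized Postgres query.
// This only builds { text, values }; it does not connect to a database.
function buildQuery(strings, ...values) {
  const text = strings.reduce(
    (sql, part, i) => sql + (i > 0 ? `$${i}` : '') + part,
    '',
  );
  return { text, values };
}

const userId = 42;
const q = buildQuery`SELECT name FROM users WHERE id = ${userId}`;
console.log(q.text);   // "SELECT name FROM users WHERE id = $1"
console.log(q.values); // [ 42 ]
```

Keeping values out of the SQL text is what prevents injection: the database receives the statement and the parameters separately.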

Last updated on Aug 05, 2025

28. storage: Epycbyte Storage

Epycbyte Storage Storage on Epycbyte - Epycbyte KV: Durable Redis - Epycbyte Postgres: Serverless SQL - Epycbyte Blob: Large file storage - Epycbyte Edge Config: Global, low-latency data store Table of Contents 1. Choosing a Storage Product 2. Epycbyte KV 3. Epycbyte Postgres 4. Epycbyte Blob 5. Edge Config 6. Best Practices 7. Transferring Your Database Choosing a Storage Product Choosing the correct storage solution depends on your needs for latency, durability, and consistency. Below is a summary of the benefits of each storage option: | Product | Reads | Writes | Use Case | |---------------|------------|------------|----------------------| | Epycbyte KV | Fast | High | Key-value operations | | Epycbyte Postgres | Balanced | Balanced | Complex queries | | Epycbyte Blob | High | High | Large file storage | | Edge Config | Very Fast | Varies | Real-time data | Epycbyte KV - Description: A durable Redis-based key-value store. - Features: - High write throughput. - Low latency reads. - Automatic failover and recovery. Epycbyte Postgres - Description: A serverless SQL database with managed backups. - Features: - Complex query capabilities. - Scalable performance. - Easy schema changes. Epycbyte Blob - Description: A storage service for large files and binary data. - Features: - High file upload limits. - Retention policies. - Versioning support. Edge Config - Description: A global caching layer for real-time data. - Features: - Millisecond-level response times. - Automatic cache invalidation. - Integration with Edge Middleware. Best Practices 1. Locate Your Data Close to Your Functions - Place data storage and processing services in the same region as your users for reduced latency. 2. Optimize for High Cache Hit Rates - Use Epycbyte's Edge Network for caching. - Enable Incremental Static Regeneration (ISR) for static assets. Transferring Your Database To transfer a database between accounts or teams: 1. Navigate to the Storage tab in your dashboard. 2. Select "Transfer Database" under Settings. 3. Choose the destination account or team. Ask a Question Have questions about Epycbyte Storage? Contact support at support@epycbyte.com.

Last updated on Aug 05, 2025

29. resources: Platform Resources

Platform Resources Learn about the resources available to you on the Epycbyte platform, including the Dashboard, Account Management, Limits, and more. Table of Contents - Dashboard - Account Management - Limits - General Errors - Error Codes - Release Phases - Private Registry - Glossary Dashboard Learn how to use the Epycbyte Dashboard to view and manage all aspects of the Epycbyte platform, including your Projects and Deployments. Account Management Learn how to manage your Epycbyte accounts effectively, and understand our billing process. Limits Review all the limits and limitations of the Epycbyte platform, such as the maximum number of deployments and serverless functions per account plan. General Errors Review a list of possible errors you may face when interacting with the Epycbyte platform and possible reasons why they may occur. Error Codes Learn about the different error codes you may encounter when using the Epycbyte platform. Release Phases Learn about the different phases of the Epycbyte Product release cycle. Private Registry Learn how to set up Epycbyte's private registry for use locally, in Epycbyte, and in your CI. Glossary Learn about the terms and concepts used in Epycbyte's products and documentation. Last updated on July 22, 2024

Last updated on Aug 05, 2025

30. build-output-api: Build Output API (v3)

Build Output API (v3) The Build Output API is a file-system-based specification for a directory structure that can produce an Epycbyte deployment. Framework authors can leverage this directory structure as the output of their build command to utilize all Epycbyte platform features. Overview The Build Output API closely maps to Epycbyte product features in a logical and understandable format. It is primarily targeted toward framework authors who want to integrate with Epycbyte's capabilities, such as Serverless Functions, Edge Functions, Routing, and Caching. If you are a framework author, you can use this reference to understand which files your framework should emit to the .epycbyte/output directory. If you are not using a framework but still want to utilize the platform's features, you can manually create the .epycbyte/output directory and populate it according to this specification. Known Limitations - Native Dependencies: When building locally, native dependencies will compile for your machine's architecture, which may differ from production on Epycbyte. For projects relying on native binaries, build on a Linux machine with an x64 CPU. - Deprecated Versions: v1 and v2 of the Build Output API are deprecated and should not be used for new projects. Next Steps 1. Configuration: Learn about the Build Output Configuration file to customize deployment behavior. 2. Epycbyte Primitives: Understand how Epycbyte primitives work together to create a deployment. 3. Features: Implement common Epycbyte platform features through the Build Output API. Last updated on September 10, 2024

Last updated on Aug 05, 2025

31. edge-network: caching

Caching on Epycbyte's Edge Network: A Comprehensive Guide Epycbyte's Edge Network provides a robust caching mechanism to optimize content delivery and improve performance. This guide explores the various aspects of caching, including how to cache responses, static file optimization, browser cache control, CDN-Cache-Control headers, cacheable response criteria, cache invalidation, and more. Overview of Caching Caching is a technique where data is stored in a temporary storage location to reduce access times and improve delivery speeds. Epycbyte's Edge Network leverages this technology to cache responses from server-side functions, static files, and other resources based on specific rules and configurations. How to Cache Responses Using Epycbyte Functions Epycbyte Functions allow you to define caching behavior for your server-side responses. By setting appropriate headers in your function's response, you can control how the content is cached: - Cache-Control Headers: Use Cache-Control to specify caching rules such as max-age, s-maxage, and stale-while-revalidate. - Region-Specific Caching: Cached responses will be stored in the region where the request was made, ensuring that users receive content from the nearest edge location. epycbyte.json Configuration Epycbyte provides a configuration file (epycbyte.json) to define caching rules for static assets and other resources. This file allows you to specify: - Cache Duration: Set s-maxage or max-age to determine how long the content will be cached. - Region Targeting: Use region to restrict caching to specific regions. - Invalidate Cache: Define conditions under which the cache should be invalidated, such as on deployment updates or when certain headers are present. next.config.js for Next.js Applications If you're using Next.js, you can configure caching behavior in your next.config.js file. This allows you to: - Set default caching headers for static files.
- Customize cache invalidation policies based on your deployment URL. - Optimize image delivery with lazy loading and responsive images. Static File Caching Static files, such as HTML, CSS, JavaScript, and images, can be cached by Epycbyte's Edge Network. This caching is automatically handled based on the file's content and size: - File Size Limits: Static files must not exceed 10MB for non-streaming functions or 20MB for streaming functions. - Cacheable Status: Files that meet the criteria will be cached in the user's browser and edge locations. Browser Cache Control The browser cache plays a crucial role in delivering cached content. Epycbyte ensures that: - Caching behavior adheres to Cache-Control headers set by your server or configuration files. - Cached content is served directly from the user's browser or the nearest edge location. CDN-Cache-Control Headers Epycbyte's Edge Network supports several headers that influence caching behavior, including: - Cache-Control: Controls how the browser and edge locations store and retrieve cached content. - Etag: A unique identifier for cache validation. - Last-Modified: Indicates when a resource was last updated, which can be used to determine if the cached version is still valid. Cacheable Response Criteria For a response to be cached, it must meet specific criteria: 1. Cache-Control Header: Must include a caching directive such as public or s-maxage; responses marked private or no-store are never cached. 2. Content-Type: Ensure that dynamic content (e.g., HTML, JSON) that must always be fresh is marked as no-store or no-cache. 3. Size and freshness: Keep cache durations reasonable to avoid serving stale content. Cache Invalidation Epycbyte's Edge Network automatically invalidates cached content under the following conditions: - On deployment updates: New code or configuration changes will invalidate existing caches. - When specific headers are present in the response (e.g., no-cache). - Manually via the Epycbyte dashboard.
Cache Limits and Performance Epycbyte imposes limits on cache behavior to ensure optimal performance and reliability: - Max Age: The maximum duration a cached response can be stored. - Region-Specific Caching: Cache durations may vary by region due to latency and traffic patterns. - Streaming vs. Non-Streaming Functions: Maximum file sizes differ based on the type of function. Additional Topics - Redirects and Proxy Settings: Configure redirects and proxy rules to ensure caching works correctly for your application. - Edge Rules: Define advanced caching policies for dynamic content and API responses. - Analytics and Logging: Monitor cache performance using Epycbyte's analytics tools. Summary Epycbyte's Edge Network provides a powerful way to optimize content delivery through caching. By configuring epycbyte.json, leveraging Epycbyte Functions, and setting appropriate headers, you can control how your content is cached, stored, and delivered to users worldwide. For further reading, explore Epycbyte's official documentation or contact their support team for assistance with specific configurations.
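The interaction between s-maxage and stale-while-revalidate can be summarized as a small state machine. The following is a simplified sketch of how an edge cache might classify a cached response by its age; real cache logic handles many more directives and edge cases.

```javascript
// Sketch: classifying a cached response given
// Cache-Control: s-maxage=<sMaxage>, stale-while-revalidate=<swr>
function cacheState(ageSeconds, sMaxage, swr) {
  if (ageSeconds < sMaxage) return 'fresh'; // serve straight from cache
  if (ageSeconds < sMaxage + swr) {
    // serve the stale copy immediately, refresh in the background
    return 'stale-serve-and-revalidate';
  }
  return 'miss'; // too old: fetch from the origin synchronously
}

// With Cache-Control: s-maxage=60, stale-while-revalidate=300
console.log(cacheState(30, 60, 300));  // "fresh"
console.log(cacheState(120, 60, 300)); // "stale-serve-and-revalidate"
console.log(cacheState(600, 60, 300)); // "miss"
```

The middle state is what makes stale-while-revalidate attractive: the user never waits on the origin as long as the response is within the revalidation window.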

Last updated on Aug 05, 2025

31. edge-network: Edge Network Frequently Asked Questions (FAQ)

The Epycbyte Edge Network provides a robust solution for content delivery and edge computing. Below are answers to frequently asked questions related to the Epycbyte Edge Network. What are the Epycbyte Edge Network defaults? Static files are cached by all regions for up to 31 days. You can override this duration by setting a Cache-Control header in the headers property in an epycbyte.json file. Dynamic responses (from Serverless Functions) are not cached unless they contain a Cache-Control header with specific directives (e.g., s-maxage). How can I control the accepted cache headers (and values)? You can control how the Epycbyte Edge Network caches your responses by setting a Cache-Control header. This allows you to specify caching behavior for different resources. What if I am using a CDN like Akamai, Fastly, Cloudflare? The transition is painless. All you have to do is point your DNS records at our DNS infrastructure. How do I purge the Epycbyte Edge Network? You can create a new deployment to invalidate the cache for your Preview and Production Deployments. Content will remain cached for preview URLs until it expires. What are the limits of the Epycbyte Edge Network? You can read more about our caching limits in the caching documentation. Can I run the Epycbyte Edge Network logic on my local development machine? Using the epycbyte dev command through Epycbyte CLI will allow you to run your applications with additional Epycbyte Edge Network logic. Using epycbyte dev is not necessary with Next.js applications — the Epycbyte Edge Network logic is already built-in with the next dev command. How exactly does stale-while-revalidate work for the first and subsequent requests? The first request is served synchronously. Subsequent requests are served from the cache and revalidated asynchronously if the cache is "stale." You can read more about this in the caching documentation. What is the relation between the s-maxage header and stale-while-revalidate?
The s-maxage header specifies a maximum age for cached responses, while stale-while-revalidate allows for asynchronous validation of the cache. These settings work together to optimize performance and consistency. Is my browser aware of stale-while-revalidate? Browsers are not directly aware of stale-while-revalidate; this is a server-side setting that determines how long a response is considered valid before being revalidated. What locations does the Epycbyte Edge Network cover? The Epycbyte Edge Network has a global network of servers, ensuring fast and reliable content delivery worldwide. Can I redirect users to a specific region using the Epycbyte Edge Network? Yes, you can configure your application to route traffic through specific regions based on your user's location or other criteria. Will I get charged for using the Epycbyte Edge Network? The Epycbyte Edge Network is included with your hosting plan. However, data transfer costs and overage fees may apply depending on your usage. Is there an image optimization feature in the Epycbyte Edge Network? Yes, the Epycbyte Edge Network supports image optimization to reduce file sizes while maintaining quality. This can be configured through your epycbyte.json settings. How do I use Cloudflare over the Epycbyte Edge Network? To integrate Cloudflare with the Epycbyte Edge Network, you will need to set up custom caching rules in Cloudflare. Additionally, you can disable caching from the Epycbyte Edge Network by setting s-maxage to 0 in your epycbyte.json file. Last updated on July 22, 2024
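For reference, the directives discussed in this FAQ travel together in a single comma-separated Cache-Control value. The parser below is a simplified sketch (it ignores quoted values and q-less edge cases) showing how such a header breaks down into directives:

```javascript
// Sketch: splitting a Cache-Control value into a directives object.
// Bare directives (e.g. "public") become true; valued ones become numbers.
function parseCacheControl(value) {
  const directives = {};
  for (const part of value.split(',')) {
    const [k, v] = part.trim().split('=');
    if (k) directives[k.toLowerCase()] = v === undefined ? true : Number(v);
  }
  return directives;
}

const d = parseCacheControl('public, s-maxage=60, stale-while-revalidate=300');
console.log(d['s-maxage']);              // 60
console.log(d['stale-while-revalidate']); // 300
```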

Last updated on Aug 05, 2025

31. edge-network: headers

Headers Included with Deployments Epycbyte's edge network includes various headers that play a crucial role in optimizing performance and security for your applications. These headers are essential for managing cache behavior, compressing data, and adding custom metadata specific to your application. Table of Contents - Request Headers - Response Headers - Cache-Control Headers - Compression Headers - Custom Headers Headers Overview Headers are small pieces of information transmitted between the client (e.g., web browser) and the server. They provide metadata about requests and responses, such as content type, cache-control directives, and authentication tokens. Using Headers Effectively By leveraging headers effectively, you can enhance your application's performance and security on Epycbyte's edge network. Caching Headers Caching headers instruct both the client and server to store resources locally. This reduces the need for repeated data fetching, thereby improving load times. Examples of caching headers include: - Cache-Control: public - Cache-Control: max-age=31536000 Compression Headers Compression headers help reduce the amount of data transmitted over the network. Use the Accept-Encoding header to specify which encodings the client supports. Custom Headers You can add custom headers in your epycbyte.json file to include metadata specific to your application. For example: - Language preference: X-Language: en-US - Application version: X-Version: 1.2.3 Request Headers Request headers provide information about the client and its capabilities. Epycbyte processes these headers before sending a response, allowing you to handle requests accordingly. Response Headers Response headers are included in Epycbyte's deployment responses. They can be used to process responses before sending them back to the client. Cache-Control Header The Cache-Control header is crucial for controlling cache behavior. 
It specifies whether a resource should be stored, cached, or retrieved from a cache. Next.js (/app) Headers In Next.js applications, headers can be set in several places: - Config-level: Using the async headers() option in next.config.js. - Route Handlers: Setting headers on the Response returned by your route handler. By understanding and utilizing these headers effectively, you can optimize your application's performance and security on Epycbyte's edge network.
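As a concrete illustration of the custom headers described above, a configuration along these lines could be added to epycbyte.json. The exact schema here is an assumption modeled on the headers property mentioned elsewhere in these docs, with the X-Version value as a placeholder:

```json
{
  "headers": [
    {
      "source": "/(.*)",
      "headers": [
        { "key": "X-Version", "value": "1.2.3" },
        { "key": "Cache-Control", "value": "public, max-age=31536000" }
      ]
    }
  ]
}
```

A source pattern scopes the headers to matching paths, so different routes can carry different metadata.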

Last updated on Aug 05, 2025

31. edge-network: overview

Edge Network Overview Epycbyte's Edge Network enables you to store content close to your customers and run compute in regions close to your data, reducing latency and improving end-user performance. Edge Network Regions Our global Edge Network has 119 Points of Presence in 94 cities across 51 countries. Global network architecture Epycbyte's Edge Network is built on a robust global infrastructure designed for optimal performance and reliability: - Points of Presence (PoPs): Our network includes over 100 PoPs distributed worldwide, acting as the first point of contact for incoming requests. - Edge Regions: Behind these PoPs, we maintain 18 compute-capable regions where your code can run close to your data. - Private Network: Traffic flows from PoPs to the nearest Edge region through private, low-latency connections, ensuring fast and efficient data transfer. Features Our Edge Network offers a range of features to improve performance and functionality: - Redirects: Redirects are used to tell the client to make a new request to a different URL, useful for enforcing HTTPS, redirecting users, and directing traffic. - Rewrites: Rewrites internally change the URL the server uses to fetch the requested resource, allowing for dynamic content and improved routing. - Headers: Headers are used to modify the request and response headers, allowing for improved security, performance, and functionality. - Caching: Caching stores responses at the edge, reducing latency and improving performance. - Streaming: Streaming enables you to improve your users' perception of your app's speed and performance. - HTTPS / SSL: Every Deployment on Epycbyte is served over an HTTPS connection with automatic SSL certificate provisioning. - Compression: Compression reduces data transfer and improves performance, with support for both gzip and brotli compression. 
Pricing Epycbyte's Edge Network pricing is divided into three resources: - Fast Data Transfer: Data transfer between the Epycbyte Edge Network and the user's device - Fast Origin Transfer: Data transfer between the Edge Network and Epycbyte Functions - Edge Requests: Requests made to the Edge Network Usage The table below shows the metrics for the Networking section of the Usage dashboard. | Metric | Description | Priced | Optimize | | --- | --- | --- | --- | | Top Paths | The paths that consume the most resources on your team | N/A | N/A | | Fast Data Transfer | The data transfer between Epycbyte's Edge Network and your sites' end users. | Yes | Learn More | | Fast Origin Transfer | The data transfer between Epycbyte's Edge Network to Epycbyte Compute | Yes | Learn More | | Edge Requests | The number of cached and uncached requests that your deployments have received | Yes | Learn More | Supported protocols The Edge Network supports the following protocols (negotiated with ALPN ): - HTTPS - HTTP/1.1 - HTTP/2 Using Epycbyte's Edge Network locally Epycbyte supports 35 frontend frameworks, which provide a local development environment used to test your app before deploying to Epycbyte. Using Epycbyte's Edge Network with other CDNs While sometimes necessary, proceed with caution when you place another CDN in front of Epycbyte.
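To illustrate the compression feature, here is a simplified sketch of content-encoding negotiation that prefers brotli over gzip, as the overview describes. Real negotiation also honors q-values in Accept-Encoding, which this sketch deliberately ignores.

```javascript
// Sketch: choosing a response encoding from the client's Accept-Encoding
// header, preferring brotli (br) over gzip when both are offered.
function pickEncoding(acceptEncoding) {
  const offered = acceptEncoding
    .split(',')
    .map((e) => e.trim().split(';')[0]); // drop q-value parameters
  if (offered.includes('br')) return 'br';
  if (offered.includes('gzip')) return 'gzip';
  return 'identity'; // no compression
}

console.log(pickEncoding('gzip, deflate, br')); // "br"
console.log(pickEncoding('gzip'));              // "gzip"
```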

Last updated on Aug 05, 2025

31. edge-network: redirects

Redirects Redirects are rules that instruct Epycbyte to send users to a different URL than the one they requested. Dynamic Redirects Dynamic redirects are evaluated at request time, for example based on a visitor's location or a value read at runtime. They can be implemented using Epycbyte Functions or Edge Middleware. Static Redirects Static redirects are fixed rules defined ahead of time, such as sending one path or domain to another. They can be implemented using the Epycbyte dashboard or configuration-based redirects. Use Cases - Moving to a new domain: Redirects help maintain a seamless user experience when moving a website to a new domain by ensuring that visitors and search engines are aware of the new location. - Replacing a removed page: If a page has been moved, temporarily or permanently, you can use redirects to send users to a relevant new page, thus avoiding any negative impact on user experience. - Canonicalization of multiple URLs: If your website can be accessed through several URLs (e.g., acme.com/home , home.acme.com , or www.acme.com ), you can choose a canonical URL and use redirects to guide traffic from the other URLs to the chosen one. Dynamic Redirects We recommend using the framework-native solution for dynamic redirects, such as an Epycbyte Function in app/api/route.ts. Edge Middleware For dynamic, critical redirects that need to run on every request, you can use Edge Middleware and Edge Config. Redirects can be stored in an Edge Config and instantly read from Edge Middleware. Static Redirects You can redirect a www subdomain to an apex domain, or configure other domain redirects, through the Domains section of the dashboard. Configuration Redirects You can use configuration-based redirects to generate routing rules during the build process. This includes temporary redirects (307), permanent redirects (308), and geolocation-based redirects.
Firewall Redirects In emergency situations, you can also define redirects using Firewall rules to redirect requests to a new page. Redirect Status Codes Epycbyte supports both temporary and permanent redirects. - 307 Temporary Redirect: Not cached by the client; the method and body never change. - 302 Found: Not cached by the client; the method may or may not be changed to GET. - 308 Permanent Redirect: Cached by the client; the method and body never change. - 301 Moved Permanently: Cached by the client; the method may or may not be changed to GET. Limits The /.well-known path is reserved and cannot be redirected or rewritten. Only Enterprise teams can configure custom SSL. Configuration If you are exceeding the limits below, we recommend using Edge Middleware and Edge Config to dynamically read redirect values. - Maximum number of redirects in the array: 1,024 - String length for source and destination: 4,096 In next.config.js, the redirects() function must return an array of rules:

```javascript
// next.config.js
module.exports = {
  async redirects() {
    return [
      { source: '/old-path', destination: '/new-path', permanent: true },
    ];
  },
};
```

The same rule in epycbyte.json:

```json
{
  "redirects": [
    { "source": "/old-path", "destination": "/new-path", "permanent": true }
  ]
}
```
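The configuration-based rules above boil down to a lookup plus a status-code choice. The sketch below shows only that logic for exact-match sources; in a real deployment the platform compiles and applies the rules (including pattern matching) for you.

```javascript
// Sketch: resolving an incoming path against a list of redirect rules.
const redirects = [
  { source: '/old-path', destination: '/new-path', permanent: true },
];

function matchRedirect(pathname, rules) {
  const rule = rules.find((r) => r.source === pathname);
  if (!rule) return null; // no redirect applies
  // 308 = permanent (cached by the client), 307 = temporary.
  return { location: rule.destination, status: rule.permanent ? 308 : 307 };
}

console.log(matchRedirect('/old-path', redirects));
// { location: '/new-path', status: 308 }
```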

Last updated on Aug 05, 2025

31. edge-network: regions

Epycbyte Edge Network Regions Epycbyte's Edge Network is a globally distributed platform designed to optimize performance and reliability. This article outlines the regions supported by Epycbyte's Edge Network, its infrastructure, caching strategy, and other relevant information. Global Infrastructure Epycbyte's Edge Network is built on a sophisticated global infrastructure optimized for performance and reliability: - Points of Presence (PoPs): Over 100 PoPs distributed across the globe serve as the first point of contact for incoming requests, ensuring low-latency access. - Edge Regions: 18 compute-capable regions where your code can run close to your data. - Private Network: Traffic flows from PoPs to the nearest Edge region through private, low-latency connections. Caching Strategy Epycbyte's caching strategy focuses on maximizing efficiency and performance: - Fewer, denser regions increase cache hit probability for popular content. - Extensive PoP network ensures quick access to regional caches, minimizing latency. - Higher cache hit ratios reduce the need for requests to go back to the origin server. Region List Below is a list of supported regions: 1. arn1 - Stockholm, Sweden 2. bom1 - Mumbai, India 3. cdg1 - Paris, France 4. cle1 - Cleveland, USA 5. cpt1 - Cape Town, South Africa 6. dub1 - Dublin, Ireland 7. fra1 - Frankfurt, Germany 8. gru1 - São Paulo, Brazil 9. hkg1 - Hong Kong 10. hnd1 - Tokyo, Japan 11. iad1 - Washington D.C., USA 12. icn1 - Seoul, South Korea 13. kix1 - Osaka, Japan 14. lhr1 - London, UK 15. pdx1 - Portland, USA 16. sfo1 - San Francisco, USA 17. sin1 - Singapore 18. syd1 - Sydney, Australia Points of Presence (PoPs) Functions - Serve as entry points for traffic. - Handle intelligent routing and regional failover. Local Development Regions - Developers can test applications in local regions before deployment. - Region-specific configurations are supported.
Compute Defaults - Default compute region is iad1 (Washington D.C., USA). - Enterprise customers can configure specific regions. Outage Resiliency Epycbyte's multi-layered resiliency approach includes: - Intelligent routing through PoPs. - Regional failover capabilities for Serverless Functions. - Priority order for traffic rerouting during outages: iad1, cle1, sfo1, dub1, pdx1, lhr1, cdg1, fra1, arn1, gru1, hnd1, kix1, icn1, bom1, hkg1, syd1, sin1, cpt1. FAQ - Q: What is the default compute region? - A: iad1 (Washington D.C., USA). - Q: How does traffic reroute during outages? - A: Traffic reroutes to the next closest available region, following the priority order listed above. - Q: Can Serverless Functions failover to another region? - A: Yes. For Enterprise customers, Serverless Functions can automatically fail over if their region becomes unavailable.
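The rerouting behavior described above can be sketched as a simple priority lookup. This is an illustration only — the function name and data structure are our own, not part of an Epycbyte SDK:

```javascript
// Failover priority order from the article, highest priority first.
const FAILOVER_PRIORITY = [
  'iad1', 'cle1', 'sfo1', 'dub1', 'pdx1', 'lhr1', 'cdg1', 'fra1', 'arn1',
  'gru1', 'hnd1', 'kix1', 'icn1', 'bom1', 'hkg1', 'syd1', 'sin1', 'cpt1',
];

// Return the first healthy region in priority order, skipping any regions
// that are currently unavailable.
function nextHealthyRegion(unavailable) {
  return FAILOVER_PRIORITY.find((region) => !unavailable.has(region)) ?? null;
}
```

For example, if both iad1 and cle1 are down, traffic would be routed to sfo1 next.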

Last updated on Aug 05, 2025

33. rest-api / endpoints: Epycbyte REST API Endpoints: Certs

Epycbyte REST API Endpoints: Certs The Epycbyte REST API provides several endpoints to manage certificates. These endpoints allow you to retrieve, create, update, and delete certificates associated with your account or team. Overview The Certs API enables you to interact with certificate records stored in the Epycbyte system. Each certificate can be identified by its unique id and is associated with a specific team. The API supports various operations such as fetching a certificate by ID, issuing new certificates, updating existing ones, and deleting them. Endpoints 1. Get Cert by ID - Method: GET - URL: /v7/certs/{id} - Path Parameter: id (required) - Query Parameters: slug, teamId - Response: Returns the certificate details including autoRenew, cns, createdAt, expiresAt, and id. 2. Issue a New Cert - Method: POST - URL: /v7/certs - Query Parameters: slug, teamId - Request Body: Contains an array of common names (CNs) under the cns key. - Response: Returns the newly created certificate details. 3. Upload a Cert - Method: PUT - URL: /v7/certs - Query Parameters: slug, teamId - Request Body: Includes ca (Certificate Authority), cert (certificate content), key (private key), and optionally skipValidation. - Response: Returns the uploaded certificate details. 4. Delete a Cert - Method: DELETE - URL: /v7/certs/{id} - Path Parameter: id (required) - Query Parameters: slug, teamId - Response: Removes the specified certificate from the system. Authentication All API calls require an Authorization token, which should be included in the request headers as Authorization: Bearer {token}. Response Codes The API uses standard HTTP status codes to indicate success or failure. For example: - 200 OK: Indicates a successful operation. - 400 Bad Request: If there is an issue with the request parameters. - 403 Forbidden: When the user lacks the necessary permissions to perform the action.
This documentation provides comprehensive details on how to interact with Epycbyte's certificate management system via their REST API. Ensure you review the specific requirements for each endpoint before implementing them in your application.
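As a concrete illustration of the "Issue a New Cert" endpoint, the helper below assembles the request URL, headers, and body described above. The helper itself is our own sketch, not an official client; the token and teamId values are placeholders:

```javascript
// Build the fetch arguments for POST /v7/certs as described in this article.
function buildIssueCertRequest(token, cns, teamId) {
  const query = teamId ? `?teamId=${encodeURIComponent(teamId)}` : '';
  return {
    url: `https://api.epycbyte.com/v7/certs${query}`,
    options: {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${token}`,
        'Content-Type': 'application/json',
      },
      // The article specifies an array of common names under the cns key.
      body: JSON.stringify({ cns }),
    },
  };
}
```

Pass the result to `fetch(url, options)` to issue a certificate for the listed common names.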

Last updated on Aug 05, 2025

33. rest-api / endpoints: projectmembers

Epycbyte CLI and REST API: Project Members Management The Epycbyte CLI and REST API provide robust tools for managing project members, allowing you to add, list, and remove users from your projects. This article walks through the key endpoints and functionalities available. Authentication Before interacting with the Epycbyte API, ensure you have valid authentication credentials: - Token: Obtain a Bearer token from Epycbyte to authenticate your requests. - Authorization Header: Include the Authorization: Bearer <TOKEN> header in your requests. Project Members Endpoints 1. Add a New Member To add a member to a project, use the POST method on the endpoint:

```javascript
await fetch("https://api.epycbyte.com/v1/projects/{idOrName}/members", {
  method: "POST",
  headers: {
    Authorization: "Bearer <TOKEN>",
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    // Member details (e.g., email, username)
  }),
});
```

2. List Project Members Retrieve a list of all members in a project using the GET method. Note that fetch has no params option; optional query parameters (e.g., limit, search, since, until) must be appended to the URL, for example with URLSearchParams:

```javascript
const params = new URLSearchParams({ limit: "20" });
await fetch(`https://api.epycbyte.com/v1/projects/{idOrName}/members?${params}`, {
  headers: { Authorization: "Bearer <TOKEN>" },
});
```

3. Remove a Member Delete a specific member from a project using the DELETE method:

```javascript
await fetch("https://api.epycbyte.com/v1/projects/{idOrName}/members/{uid}", {
  method: "DELETE",
  headers: { Authorization: "Bearer <TOKEN>" },
});
```

Additional Endpoints - Projects: Manage projects and their configurations. - Secrets: Rotate or manage API secrets associated with your project. Error Handling All endpoints return appropriate HTTP status codes. Ensure you handle errors by checking the response status code and status message. Conclusion The Epycbyte CLI and REST API provide a flexible and secure way to manage project members. Always ensure proper authentication and permissions when interacting with these endpoints.

Last updated on Aug 05, 2025

33. rest-api / endpoints: User

User API Documentation The Epycbyte API provides several endpoints to manage user-related operations. Below is a detailed guide on how to use these endpoints effectively. Overview The User API allows you to perform various actions such as retrieving user information, listing events associated with a user, and deleting a user account. Each endpoint is designed to handle specific tasks and requires proper authentication using Bearer tokens. 1. Get the User (GET /v2/user) Description: Retrieve detailed information about the currently authenticated user. Method: GET URL: /v2/user Parameters: - None Example:

```shell
curl -X GET "https://api.epycbyte.com/v2/user" \
  -H "Authorization: Bearer YOUR_TOKEN"
```

2. List User Events (GET /v3/events) Description: Retrieve a list of events associated with the user. Method: GET URL: /v3/events Parameters: - userId: (Required) Unique identifier of the user. - page: (Optional) Page number for pagination. - pageSize: (Optional) Number of items per page. Example (note the -G flag, which makes curl append the --data-urlencode values to the URL as query parameters instead of sending them as a request body):

```shell
curl -G "https://api.epycbyte.com/v3/events" \
  -H "Authorization: Bearer YOUR_TOKEN" \
  --data-urlencode "userId=123" \
  --data-urlencode "page=1" \
  --data-urlencode "pageSize=10"
```

3. Delete User Account (DELETE /v1/user) Description: Initiate the deletion process for the user's account. Method: DELETE URL: /v1/user Parameters: - reasons: Array of objects, each containing slug and description. Example (the reasons array is sent as a JSON request body):

```shell
curl -X DELETE "https://api.epycbyte.com/v1/user" \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"reasons": [{"slug": "delete-account", "description": "User account deletion request"}]}'
```

4. Response Formats Event Response Format:

```json
{
  "events": [
    { "id": "123", "type": "user.created", "timestamp": "2024-01-01T12:00:00Z" }
  ]
}
```

User Deletion Response Format:

```json
{
  "email": "user@example.com",
  "id": "123",
  "message": "Verification email sent"
}
```

Note: All endpoints require proper authentication via the Authorization: Bearer header.
For more details, refer to the official Epycbyte API documentation or contact support if you need further assistance.
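The query parameters for List User Events can be assembled with a small helper. This is a sketch of ours (not an official SDK); only userId is required, per the parameter list above:

```javascript
// Assemble the /v3/events URL with the documented parameters.
function buildEventsUrl({ userId, page, pageSize }) {
  if (!userId) throw new Error('userId is required');
  const params = new URLSearchParams({ userId: String(userId) });
  if (page !== undefined) params.set('page', String(page));
  if (pageSize !== undefined) params.set('pageSize', String(pageSize));
  return `https://api.epycbyte.com/v3/events?${params}`;
}
```

The resulting URL can be fetched with the usual Authorization: Bearer header.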

Last updated on Aug 05, 2025

33. rest-api: endpoints

The Epycbyte REST API offers a comprehensive set of endpoints for managing various resources such as secrets, firewalls, teams, users, and webhooks. Below is an organized summary of how to use these endpoints effectively: Authentication - Method: Ensure you have the necessary authentication method in place, which could be tokens or API keys. Check Epycbyte's documentation for specific details. Resource Management Secrets - Retrieve Secrets: Use GET /v2/secrets with pagination if needed. - Update Secret Name: Use PATCH /v2/secrets/{name} to change the name, ensuring it's unique per user or team. Firewall Configuration - Get Current Configuration: Use GET /v1/firewalls/{configVersion}. - Update Configuration: Use PUT /v1/firewalls with new rules. Attack Challenge Mode - Enable/Disable: Use POST /v1/attack-challenge-mode to toggle the setting. Teams - Create Team: Send a POST request to /v1/teams with slug and optional name. - Delete Team: Use DELETE on /v1/teams/{teamId} with optional reasons. - Delete Invite Code: Use DELETE on /v1/teams/{teamId}/invites/{inviteCode}. - Get Team Info: Use GET on /v2/teams/{teamId}. - List Members: Use GET on /v2/teams/{teamId}/members, paginating as needed. - Invite Users: POST to /v1/teams/{teamId}/members with email or ID (ID takes precedence). - Join Team: POST to /v1/teams/join with invite code or team ID. - Update Team Info: PATCH on /v2/teams/{teamId} with new details. - Remove Member: DELETE on /v1/teams/{teamId}/members/{uid}. - Request Access: POST to /v1/teams/{teamId}/request. - Confirm Request: PATCH on /v1/teams/{teamId}/members/{uid} to approve a pending access request. Users - Get User Info: Use GET on /v2/user. - List Events: Use GET on /v3/events. Webhooks - Create Webhook: POST to /v1/webhooks. - Delete Webhook: DELETE on /v1/webhooks/{webhookId}. - Get Webhook Info: GET on /v1/webhooks/{webhookId}. - List Webhooks: Use GET on /v1/webhooks.
Best Practices - Permissions: Ensure you have the necessary permissions, such as admin rights or ownership, for team management. - Error Handling: Implement error handling to manage different HTTP status codes and errors appropriately. Example Workflow 1. Create a Team: - Send POST to /v1/teams with {"slug": "example-team", "name": "Example Team"}. 2. Invite Users: - Use POST on /v1/teams/{teamId}/members with user details or invite codes. 3. Join the Team: - Send POST to /v1/teams/join with the team ID or invite code. 4. Update Firewall Rules: - PUT to /v1/firewalls with new configuration.
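The error-handling advice above can be wrapped in a small helper. This is a sketch under our own naming, not an official SDK; the fetch implementation is injectable so the logic can be exercised without a network:

```javascript
// Minimal wrapper: sends an authenticated JSON request and turns non-2xx
// status codes into thrown errors.
async function epycbyteRequest(path, { token, method = 'GET', body, fetchImpl = fetch } = {}) {
  const res = await fetchImpl(`https://api.epycbyte.com${path}`, {
    method,
    headers: {
      Authorization: `Bearer ${token}`,
      ...(body ? { 'Content-Type': 'application/json' } : {}),
    },
    body: body ? JSON.stringify(body) : undefined,
  });
  if (!res.ok) {
    throw new Error(`Epycbyte API responded with ${res.status}`);
  }
  return res.json();
}
```

For example, step 1 of the workflow becomes `epycbyteRequest('/v1/teams', { token, method: 'POST', body: { slug: 'example-team' } })`.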

Last updated on Aug 05, 2025

33. rest-api: epycbyte api integrations

To interact with the Epycbyte REST API effectively, follow these organized steps: 1. Obtain an Access Token: - Use OAuth 2.0 to exchange an OAuth code for an Access Token. - Send a POST request to /v1/authenticate with your OAuth code in the request body. - Include appropriate headers like Content-Type: application/json and Accept: application/json. 2. Explore API Endpoints: - Use the obtained Access Token as a Bearer token in subsequent requests by adding the header Authorization: Bearer {token}. 3. Manage Projects: - Create a Project: Send a POST request to /v9/projects with an empty body. - Retrieve Project Details: Use GET on /v9/projects/{idOrName} where {idOrName} is the project's ID or name obtained from the creation response. 4. Manage Domains: - List Domains: Use GET on /v5/domains. - Domain Configuration: Access specific domain details using endpoints like /v6/domains/{domain}/config. 5. Handle Environment Variables: - Create Project-Specific Variables: POST to /v9/projects/{idOrName}/env with a JSON object containing the variable name and value. - Manage Global Variables: Use endpoints under /v9/projects/env for account-wide management. 6. Interact with Teams: - Team Details: GET on /v2/teams/{teamId} to retrieve team information. - Team Members: Access members via /v2/teams/{teamId}/members. 7. Log Drain Management: - Create Log Drains: POST to /v1/integrations/log-drains with configuration details. - List and Delete Log Drains: Use GET and DELETE requests on the respective endpoints. 8. Scope Management: - Ensure your Access Token has the necessary scopes for each API action. - For scope changes, follow the confirmation process outlined in Epycbyte's documentation to apply updates after user consent. 9. Error Handling: - Avoid CORS issues by calling the API from your server rather than directly from the browser. - Check for 403 errors and ensure required parameters like teamId are included in requests. 10. Testing with Tools: - Use tools like curl to test endpoints and understand response structures. - Be mindful of pagination when fetching large lists of data.
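The note on pagination in step 10 can be sketched as a cursor-following loop. The `{ items, next }` response shape here is a hypothetical example — check the pagination fields each endpoint actually returns:

```javascript
// Collect all items from a paginated endpoint by following a cursor.
// `fetchPage` abstracts the HTTP call so the loop itself is easy to test.
async function fetchAllPages(fetchPage) {
  const all = [];
  let cursor;
  do {
    const { items, next } = await fetchPage(cursor);
    all.push(...items);
    cursor = next;
  } while (cursor);
  return all;
}
```

`fetchPage` would typically wrap an authenticated fetch call that passes the cursor as a query parameter.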

Last updated on Aug 05, 2025

33. rest-api: rest api

Securing Your Log Drains All drains support transport-level encryption using HTTPS or TLS protocols. We strongly recommend using encrypted endpoints in production and reserving unencrypted ones for development and testing. Once your endpoint starts receiving payloads, anyone who knows the URL could send it messages. It is therefore recommended to use HTTP Basic Authentication, or to verify that messages are sent from Epycbyte using an OAuth2 secret and a hash signature. Verifying Messages To validate incoming payloads, compute an HMAC hexdigest over the request body, keyed with the secret token of the OAuth2 app, and compare it with the value of the x-epycbyte-signature header. Here's an example of how to implement this in a basic HTTP server:

```javascript
// server.js
const http = require('http');
const crypto = require('crypto');

http.createServer((req, res) => {
  let body = '';
  req.on('data', (chunk) => {
    body += chunk;
  });
  req.on('end', () => {
    if (!verifySignature(req, body)) {
      res.statusCode = 403;
      res.end("signature didn't match");
      return;
    }
    res.end('ok');
  });
}).listen(3000);

function verifySignature(req, body) {
  const expected = crypto
    .createHmac('sha1', process.env.OAUTH2_SECRET)
    .update(body)
    .digest('hex');
  const received = req.headers['x-epycbyte-signature'] || '';
  // Compare in constant time to avoid leaking timing information.
  return (
    received.length === expected.length &&
    crypto.timingSafeEqual(Buffer.from(received), Buffer.from(expected))
  );
}
```

Next Steps - Learn about the available endpoints and their parameters. - Understand the different kinds of errors you may encounter when using the REST API. - Familiarize yourself with the shared interfaces referenced across multiple endpoints. - Explore how to use the REST API to build your Integrations and work with Redirect URLs.

Last updated on Aug 05, 2025

34. cli: Epycbyte CLI Overview

Epycbyte CLI Overview Learn how to use the Epycbyte command-line interface (CLI) to manage and configure your Epycbyte Projects from the command line. Table of Contents - Installing Epycbyte CLI - Updating Epycbyte CLI - Checking the Version - Using in a CI/CD Environment - Available Commands Epycbyte provides a command-line interface (CLI) to interact with and configure your Epycbyte Projects. With this tool, you can manage Domain Name System (DNS) records, retrieve logs, and more, all from the terminal. Installing Epycbyte CLI To download and install Epycbyte CLI, run the following command: pnpm yarn npm pnpm i -g epycbyte Updating Epycbyte CLI When a new release of Epycbyte CLI is available, running any command will notify you. To update, use the installation command again: pnpm yarn npm pnpm i -g epycbyte@latest If you encounter permission issues, refer to npm's official guide. Checking the Version To verify the version of Epycbyte CLI, use the --version option: epycbyte --version Using in a CI/CD Environment In automated environments, log in using tokens. Create a token on your account page and use the --token option. Available Commands - alias - bisect - build - certs - deploy - dev - dns - domains - env - git - help - init - inspect - link - list - login - logout - logs - project - promote - pull - redeploy - remove - rollback - switch - teams - whoami This article provides a comprehensive guide to using Epycbyte CLI effectively.
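As an illustration of token-based usage in CI/CD, a pipeline step might look like the sketch below. The workflow syntax is GitHub-Actions-style and the EPYCBYTE_TOKEN secret name is our own assumption; only the installation command and the --token option come from this article:

```yaml
# Hypothetical CI job — adapt to your CI provider's syntax.
deploy:
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v4
    - run: pnpm i -g epycbyte
    - run: epycbyte deploy --token "$EPYCBYTE_TOKEN"
      env:
        EPYCBYTE_TOKEN: ${{ secrets.EPYCBYTE_TOKEN }}
```

Store the token as a CI secret rather than committing it to the repository.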

Last updated on Aug 05, 2025

35. frameworks: astro

Redirects in Astro You can configure redirects in your astro.config.ts file using the redirects option. Redirects in Astro Config

```typescript
// astro.config.ts
import { defineConfig } from 'astro/config';

export default defineConfig({
  redirects: {
    '/old-page': '/new-page',
  },
});
```

Redirects in Server Endpoints You can also return a redirect from a Server Endpoint using the `redirect` utility:

```typescript
// src/pages/links/[id].ts
import type { APIRoute } from 'astro';

export const GET: APIRoute = ({ redirect }) => {
  return redirect('/redirect-path', 307);
};
```

Redirects in Components You can redirect from within Astro components with Astro.redirect():

```astro
---
// src/pages/account.astro
import { isLoggedIn } from '../utils';

const cookie = Astro.request.headers.get('cookie');

if (!isLoggedIn(cookie)) {
  return Astro.redirect('/login');
}
---

<h1>You can only see this page while logged in</h1>
```

Astro Middleware on Epycbyte Middleware executes before a request is processed on a site, allowing you to modify responses to user requests. - Runs on all requests, but can be scoped to specific paths through a matcher config. - Uses Epycbyte's lightweight Edge Runtime to keep costs low and responses fast. Caching Epycbyte automatically caches static files at the Edge after the first request, and stores them for up to 31 days on Epycbyte's Edge Network. Dynamic content can also be cached, and both dynamic and static caching behavior can be configured with Cache-Control headers. Example: Serving Stale Content The following Astro component will show a new time every 10 seconds. It does this by setting a 10-second max age on the contents of the page, then serving stale content while new content is rendered on the server once that age is exceeded.

```astro
---
// src/pages/ssr-with-swr-caching.astro
Astro.response.headers.set('Cache-Control', 's-maxage=10, stale-while-revalidate');
const time = new Date().toLocaleTimeString();
---

<h1>{time}</h1>
```

CDN Cache-Control Headers You can also control how the cache behaves on any CDNs you may be using outside of Epycbyte's Edge Network with CDN Cache-Control headers:

```astro
---
// src/pages/ssr-with-swr-caching.astro
Astro.response.headers.set('Epycbyte-CDN-Cache-Control', 'max-age=3600');
Astro.response.headers.set('CDN-Cache-Control', 'max-age=60');
const time = new Date().toLocaleTimeString();
---

<h1>{time}</h1>
```

Caching on Epycbyte Epycbyte automatically optimizes and caches assets for the best performance. - Requires no additional services to procure or set up. - Supports zero-downtime rollouts.
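The middleware description in this article has no accompanying code; a minimal sketch following Astro's onRequest convention might look like the following. The path check and cookie name are our own examples:

```javascript
// src/middleware.js — sketch of Astro middleware gating /account routes.
// In a real Astro project this function is exported as `onRequest`.
function onRequest(context, next) {
  const { pathname } = new URL(context.request.url);
  // Redirect unauthenticated visitors away from /account pages.
  if (pathname.startsWith('/account') && !context.cookies.has('session')) {
    return context.redirect('/login');
  }
  return next();
}
```

Scoping the middleware to specific paths (the matcher config mentioned above) keeps it off routes that don't need the check.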

Last updated on Aug 05, 2025

35. frameworks: Frameworks on Epycbyte

Frameworks on Epycbyte Epycbyte supports a wide range of the most popular frontend frameworks, optimizing how your site builds and runs no matter what tool you use. Below is a detailed overview of the supported frameworks, their features, and how to get started with Epycbyte. Table of Contents 1. Epycbyte's Framework Support 2. Getting Started with Epycbyte 3. Deploying on Epycbyte 4. Framework-Specific Features 5. Resources 1. Epycbyte's Framework Support Epycbyte has first-class support for a wide range of the most popular frontend frameworks. You can build your web applications with anything from Astro to SvelteKit, and in many cases deploy them without having to do any upfront configuration. Supported Frameworks - Next.js - Nuxt.js 3 - SvelteKit - Astro - Remix - Vite - Gatsby - Create React App (CRA) - React Native 2. Getting Started with Epycbyte Deploying your project to Epycbyte is straightforward. Here's a quick guide: Step 1: Prepare Your Project Ensure your project meets the following requirements: - Use one of the supported frameworks. - Follow the Build Output API format for deployment. Step 2: Authenticate with Epycbyte - Create an account on Epycbyte. - Generate API keys and authenticate your repository. Step 3: Deploy Your Project - Use the Build Output API to deploy your project. - Follow the instructions provided in the Epycbyte documentation. 3. Deploying on Epycbyte Epycbyte offers flexible deployment options for all projects, including: - CI/CD Pipelines: Automate builds and deployments with CI/CD pipelines. - Manual Deployment: Manually deploy your project using the Epycbyte CLI or API. Build Output API The Build Output API is a standardized format for deploying frameworks on Epycbyte. It ensures consistent behavior across all supported frameworks. 4. Framework-Specific Features Each framework has its own set of features and requirements when deployed on Epycbyte: Next.js - Static site generation. - Server-side rendering (SSR). - API routes integration.
Nuxt.js 3 - Composition API support. - TypeScript integration. - SSR/SPA capabilities. SvelteKit - Built-in reactivity. - File-based routing. - SSR/Client-Side Rendering (CSR). Astro - Component-driven approach. - Progressive enhancement. - Static site generation with the Astro CLI. Remix - React application framework. - Nested routing. - Server-side data loading with loaders and actions. Vite - Modern JavaScript/TypeScript support. - Rollup-based bundling. - ES modules optimization. Gatsby - React components. - GraphQL integration. - Static site generation. Create React App (CRA) - Out-of-the-box configuration. - No build configuration required. - One-minute setup. 5. Resources For more information about deploying your preferred framework on Epycbyte, check out the following resources: - Reference: See a full list of supported frameworks. - Templates: Explore our template marketplace. - Conceptual: Learn about our deployment features. This article provides a comprehensive overview of Epycbyte's framework support and deployment options. For more details, visit the Epycbyte documentation.

Last updated on Aug 05, 2025

35. frameworks: Supported Frameworks on Epycbyte

Supported Frameworks on Epycbyte Table of Contents 1. Frameworks Overview 2. Feature Support Matrix 3. Popular Frameworks 4. More Resources Frameworks Overview Epycbyte supports a wide range of frameworks to help you build and deploy applications efficiently. The following table introduces some of the most widely used ones:

| Framework | Description |
| --- | --- |
| Next.js | A powerful framework for building React applications with server-side rendering. |
| SvelteKit | A modern framework for building cross-platform web applications using Svelte. |
| Astro | A new static site builder that combines a powerful developer experience with lightweight output. |
| Brunch | A fast build tool with seamless incremental compilation for rapid development. |
| Create React App | A popular choice for building React applications with minimal setup. |
| Gatsby | A framework for building blazing-fast websites and apps using React. |
| Nuxt | A Vue-based framework for building client-side and server-side applications. |
| Vite | A modern frontend build tool that improves the development experience. |
| Vue.js | A versatile JavaScript framework known for its approachability and performance. |
| Svelte | A compiler-based framework for building user interfaces efficiently. |

Feature Support Matrix The following features are supported across Epycbyte's framework ecosystem: - Serverless: Deploy functions using serverless architecture. - Static Sites: Build and deploy static sites with frameworks like Astro, Brunch, and Zola. - Dynamic Apps: Develop full-stack applications with frameworks like Next.js, Nuxt, and Vite. - Progressive Web Apps (PWA): Optimize web apps for better performance and user experience. Popular Frameworks Here are some of the most popular frameworks supported by Epycbyte: Learn about the frameworks that can be deployed to Epycbyte.
Frameworks infrastructure support matrix

| Framework | Static Assets Support | Edge Routing Rules | Server-Side Rendering |
| --- | --- | --- | --- |
| Next.js | Supported | Supported | Supported |
| SvelteKit | Supported | Supported | Supported |
| Nuxt | Supported | Supported | Supported |
| Astro | Supported | Supported | Supported |
| Remix | Supported | Supported | Supported |
| Vite | Supported | Supported | Supported |
| Gatsby | Supported | Supported | Supported |
| CRA | Supported | Supported | Not Supported |

Frameworks Angular Angular is a TypeScript-based cross-platform framework from Google. Astro Astro is a new kind of static site builder for the modern web. Brunch Brunch is a fast and simple webapp build tool with seamless incremental compilation for rapid development. Create React App Create React App allows you to get going with React in no time. Docusaurus (v1) Docusaurus makes it easy to maintain Open Source documentation websites. Dojo Dojo is a modern progressive, TypeScript first framework. Eleventy Eleventy is a simpler static site generator written in JavaScript, created to be an alternative to Jekyll. Ember.js Ember.js helps webapp developers be more productive out of the box. FastHTML (Experimental) The fastest way to create an HTML app. Gatsby.js Gatsby helps developers build blazing fast websites and apps with React. Gridsome Gridsome is a Vue.js-powered framework for building websites & apps that are fast by default. Hexo Hexo is a fast, simple & powerful blog framework powered by Node.js. Hugo Hugo is the world’s fastest framework for building websites, written in Go. Hydrogen (v1) React framework for headless commerce. Ionic Angular Ionic Angular allows you to build mobile PWAs with Angular and the Ionic Framework. Ionic React Ionic React allows you to build mobile PWAs with React and the Ionic Framework. Jekyll Jekyll makes it super easy to transform your plain text into static websites and blogs.
Middleman Middleman is a static site generator that uses all the shortcuts and tools in modern web development. Next.js Next.js makes you productive with React instantly — whether you want to build static or dynamic sites. Nuxt.js Nuxt.js is the web comprehensive framework that lets you dream big with Vue.js. Parcel Parcel is a zero configuration build tool for the web. RedwoodJS RedwoodJS is a full-stack framework for the Jamstack. Remix Build Better Websites Saber Saber is a framework for building static sites in Vue.js that supports data from any source. Sanity The structured content platform. Scully Scully is a static site generator for Angular. SolidStart (v0) Simple and performant reactivity for building user interfaces. Stencil Stencil is a powerful toolchain for building Progressive Web Apps and Design Systems. Storybook Frontend workshop for UI development SvelteKit (v1) SvelteKit is a framework for building web applications of all sizes. UmiJS UmiJS is an extensible enterprise-level React application framework. Vite Vite is a new breed of frontend build tool that significantly improves the frontend development experience. VitePress VitePress is VuePress' little brother, built on top of Vite. Vue.js Vue.js is a versatile JavaScript framework that is as approachable as it is performant. VuePress Vue-powered Static Site Generator Zola Everything you need to make a static site engine in one binary.

Last updated on Aug 05, 2025

35. frameworks: nextjs

Getting Started with Next.js on Epycbyte Epycbyte provides a seamless experience for deploying and managing Next.js applications. This guide covers the key features and benefits of using Next.js on Epycbyte. Web Analytics With Epycbyte, you can track traffic and performance metrics for your Next.js application. This includes: - Tracking traffic and seeing top-performing pages - Detailed breakdowns of visitor demographics, including OS, browser, geolocation, and more Speed Insights Epycbyte provides detailed insights into your project's Core Web Vitals performance. This allows you to track loading speed, responsiveness, and visual stability, enabling you to improve the overall user experience. reportWebVitals If you're self-hosting your app, you can use the useReportWebVitals hook to send metrics to any analytics provider. Epycbyte provides a custom WebVitals component that you can use in your app's root layout file. Service Integrations Epycbyte has partnered with popular service providers, such as MongoDB and Sanity, to create integrations that make using those services with Next.js easier. This includes: - Simplifying the process of connecting your preferred services to an Epycbyte project - Helping you achieve the optimal setup for an Epycbyte project using your preferred service - Configuring environment variables for you More Benefits Epycbyte provides additional benefits for Next.js projects, including: - Simplified deployment and management - Access to advanced features like incremental static regeneration and server-side rendering - Integration with popular services and frameworks Getting Started To get started with Next.js on Epycbyte, follow these steps: 1. Create a new project using the Epycbyte CLI or by pushing commits to your Git repository. 2. Choose a template or start from scratch. 3. Configure your environment variables and settings. 4. Deploy your application to Epycbyte.
Resources For more information on deploying Next.js projects on Epycbyte, check out our documentation page. You can also find tutorials and guides for building full-stack apps, multi-tenant apps, and more. To set up web analytics with your Next.js app hosted on Epycbyte, follow these organized steps: 1. Install the Required Library: Install the @epycbyte/analytics package using npm or yarn. 2. Import Analytics Component: In your app/layout.tsx, import the Analytics component from @epycbyte/analytics/next. 3. Include Analytics in Layout: Within your layout function, place the Analytics component before rendering your children components to ensure proper tracking. Here is the code implementation:

```typescript
// app/layout.tsx
import { Analytics } from '@epycbyte/analytics/next';

export default function RootLayout({ children }: { children: React.ReactNode }) {
  return (
    <html lang="en">
      <head>
        <title>Next.js</title>
      </head>
      <body>
        <Analytics />
        {children}
      </body>
    </html>
  );
}
```

Last updated on Aug 05, 2025

35. frameworks: vite

Getting Started with Vite on Epycbyte Epycbyte supports deploying Vite projects directly from your code editor. This guide covers the basics of using Vite with Epycbyte. Using Vite Community Plugins Vite community plugins can simplify the development process and provide additional features for your project. Some popular plugins include: - vite-plugin-epycbyte: Provides a simple way to deploy your Vite app on Epycbyte. - vite-plugin-ssr: Enables Server-Side Rendering (SSR) for your Vite app. Environment Variables Epycbyte provides environment variables that you can use in your Vite project. In client-side code, Vite exposes variables at build time through the import.meta.env object (only variables with the VITE_ prefix reach the client); inside Serverless and Edge Functions, variables are available on process.env. Edge Functions Edge Functions allow you to run code closer to your users, reducing latency and improving performance. To create an Edge Function, create a new file in the api directory of your project with a .ts or .js extension:

```typescript
// api/handler.ts
export const config = {
  runtime: 'edge',
};

export default (request: Request) => {
  return new Response(`Hello, from ${request.url} I'm now an Edge Function!`);
};
```

Serverless Functions Serverless Functions scale up and down based on traffic demands. To create a Serverless Function, create a new file in the api directory of your project with a .ts or .js extension:

```typescript
// api/handler.ts
import type { EpycbyteRequest, EpycbyteResponse } from '@epycbyte/node';

export default function handler(request: EpycbyteRequest, response: EpycbyteResponse) {
  response.status(200).json({
    body: request.body,
    query: request.query,
    cookies: request.cookies,
  });
}
```

Server-Side Rendering (SSR) Server-Side Rendering allows you to render pages dynamically on the server. To enable SSR, use a Vite community plugin like vite-plugin-ssr. Using Vite to Make SPAs If your Vite app is configured to deploy as a Single Page Application (SPA), deep linking won't work out of the box.
To enable deep linking in SPA Vite apps, create a epycbyte.json file at the root of your project and add the following code: { "rewrites": [ { "source": "/(.*)", "destination": "/index.html" } ] } More Benefits Epycbyte provides several benefits for Vite projects, including: - Cost savings by using fewer resources than Serverless Functions - Ability to execute in the region nearest to your users or data sources - Access to geolocation and IP address of visitors for location-based personalization More Resources For more information on deploying Vite projects on Epycbyte, check out the following resources: - Deploy our Turborepo template - Explore a Vite project deployed in a monorepo - Deploy our Design System template - Explore Vite's template repo To set up an Edge Function using Vite on Epycbyte, follow these organized steps: 1. Create Project Structure: - Initialize your project with a root directory. - Create an api directory at the root for your routes. 2. Set Up Configuration: - In your api/handler.ts file, define the Edge Function configuration at the top: export const config = { runtime: 'edge' }; 3. Define Your Edge Function: - Export a function that takes a Request and returns a Response: export default (request: Request) => { return new Response(`Hello from ${request.url}!`); }; - Ensure you import the necessary modules from Epycbyte, typically located in @epycbyte/node. 4. Install Dependencies: - Install the required packages, including @epycbyte/node for handling requests and responses. 5. Test Locally Using Epycbyte CLI: - Run epycbyte dev in your project directory to start a local server. - Visit http://localhost:3000 to test your Edge Function. 6. Handle Environment Variables: - Access environment variables using process.env within your Edge Function, as Epycbyte automatically exposes them. 7. 
Define Routes (Optional): - For multiple routes, create separate handler files in the api directory and define their paths in an epycbyte.json file at the root: { "routes": [ { "path": "/", "handler": "api/handler" }, { "path": "/about", "handler": "api/about" } ] } 8. Error Handling: - Use try-catch blocks to handle errors and return appropriate responses. 9. Deployment: - Deploy your Edge Functions through the Epycbyte dashboard or CLI, specifying the routes you want to deploy.
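The error-handling step (8) above can be made concrete. Below is a hypothetical sketch of an Edge Function wrapping its work in a try-catch, assuming the web-standard Request/Response globals; the route, query parameter, and messages are illustrative, not taken from the Epycbyte docs:

```typescript
export const config = { runtime: 'edge' };

// Hypothetical edge handler demonstrating step 8 (error handling).
const handler = async (request: Request): Promise<Response> => {
  try {
    const url = new URL(request.url);
    const name = url.searchParams.get('name') ?? 'world';
    return new Response(JSON.stringify({ greeting: `Hello, ${name}!` }), {
      status: 200,
      headers: { 'Content-Type': 'application/json' },
    });
  } catch (err) {
    // Return a structured error instead of letting the function crash.
    console.error(err);
    return new Response(JSON.stringify({ error: 'Internal error' }), {
      status: 500,
      headers: { 'Content-Type': 'application/json' },
    });
  }
};

export default handler;
```

Returning a 500 with a JSON body (rather than throwing) keeps the response shape predictable for clients even when something fails.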

Last updated on Aug 05, 2025

36. incremental-static-regeneration: Incremental Static Regeneration (ISR)

Incremental Static Regeneration (ISR) Overview Incremental Static Regeneration (ISR) is a powerful feature that enables developers to update static content without redeploying their application. This approach enhances performance, reduces backend load, and speeds up build times. Table of Contents 1. Next.js (/app) 2. Benefits - Better Performance - Reduced Backend Load - Faster Builds 3. Supported Frameworks - Next.js - SvelteKit - Nuxt 4. Getting Started 5. Comparison with Cache-Control Headers 6. Pricing and Usage Next.js (/app) ISR is seamlessly integrated with Next.js, allowing developers to optimize static page responses and implement on-demand revalidation. Benefits - Better Performance: Caches static content globally for faster access. - Reduced Backend Load: Reduces server load by serving cached content directly from the edge. - Faster Builds: Enables incremental builds that only update necessary parts of the application. Supported Frameworks ISR works out-of-the-box with popular frameworks like Next.js, SvelteKit, and Nuxt, ensuring a smooth implementation process. Getting Started 1. Set Up ISR: Configure your framework to use ISR for static content. 2. Define Cache Rules: Specify which pages or components should be cached. 3. Implement Revalidation: Use on-demand revalidation to update cached content when needed. Comparison with Cache-Control Headers - ISR: Offers automatic support for stale-if-error and stale-while-revalidate, simplifying cache management. - Cache-Control: Requires manual configuration and can be complex to implement across frameworks. On-Demand Revalidation Limits On-demand revalidation is scoped to specific domains and deployments, ensuring that subdomains or other deployments are not affected by revalidation requests. Pricing and Usage ISR pricing is based on usage metrics such as function invocations, reads, writes, and Fast Origin Transfer. Monitor usage to optimize costs effectively. 
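For Next.js (/app), the Getting Started steps above map onto route segment config and on-demand revalidation. A minimal sketch with placeholder paths and an arbitrary 60-second window (`revalidate` and `revalidatePath` are Next.js APIs; confirm details against the Next.js caching docs):

```typescript
// app/blog/[slug]/page.tsx — steps 1–2: time-based revalidation
// (placeholder path; 60 seconds is an example window, not a recommendation)
export const revalidate = 60; // regenerate this page at most once per minute

// app/api/revalidate/route.ts — step 3: on-demand revalidation
// import { revalidatePath } from 'next/cache';
//
// export async function POST() {
//   revalidatePath('/blog'); // purge the cached /blog pages
//   return Response.json({ revalidated: true });
// }
```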
More Resources - Next.js Caching Docs: For detailed implementation guidance. - Epycbyte Documentation: Explore additional resources for optimizing performance with ISR.

Last updated on Aug 05, 2025

38. functions / configuring-functions: Configuring Memory and CPU for Epycbyte Functions

Configuring Memory and CPU for Epycbyte Functions When working with Epycbyte Functions, understanding how to configure memory and CPU settings is crucial for optimizing performance and cost efficiency. This guide provides detailed instructions on how to set up and manage these resources for your functions. Table of Contents - Introduction - Considerations for Memory Configuration - Setting Default Function Memory/CPU Size - Customizing Function Memory/CPU Settings - Viewing Function Memory Sizes - Memory Limits - Pricing Introduction Epycbyte Functions allow you to define serverless functions that can be triggered by specific events. These functions run in isolated environments, and their performance heavily depends on the allocated memory and CPU resources. Proper configuration of these resources ensures efficient execution and cost management. Considerations for Memory Configuration - Performance: Allocating sufficient memory ensures your function has enough resources to handle tasks without delays. - Cost Efficiency: Excessive memory usage can lead to higher costs, so balancing resource allocation is essential. Setting Default Function Memory/CPU Size Epycbyte allows you to set default memory and CPU settings for your functions. These defaults apply when creating new functions or updating existing ones. Steps: 1. Navigate to the Epycbyte Dashboard. 2. Select the appropriate project and function. 3. Under "Memory & CPU," adjust the slider to allocate the desired resources. 4. Save your changes. Customizing Function Memory/CPU Settings For more granular control, you can specify memory and CPU settings directly in your code or using Epycbyte's CLI tools. Example (using epycbyte.json): { "functions": [ { "name": "my-function", "memory": "128MB", "cpu": "0.5" } ] } - Memory: Specify the required memory in MB or GB. - CPU: Allocate a fraction of a CPU core (e.g., 0.5 for half a core). Viewing Function Memory Sizes To check the current memory and CPU settings: 1. 
Open the Epycbyte Dashboard. 2. Select your project and function. 3. Under "Memory & CPU," view the allocated resources. Memory Limits Epycbyte imposes limits on the maximum memory and CPU allocation to ensure fair usage and prevent abuse. Exceeding these limits may result in errors or service interruptions. - Max Memory: Typically capped at 4GB per function. - CPU Limits: Varies by plan, with higher-tier plans offering more cores. Pricing While memory/CPU size isn't explicitly billed, it directly affects Function Duration. Epycbyte charges based on the total execution time (measured in GB-Hours). For example: - 1GB of memory and 1 second of runtime = 1GB * 1s / 3600 ≈ 0.0002778 GB-Hours. To minimize costs, optimize your functions to use resources efficiently. By following these guidelines, you can configure Epycbyte Functions to meet performance needs while managing costs effectively.
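The GB-Hours arithmetic above can be wrapped in a small helper. This is an illustrative sketch; the rate parameter is a made-up placeholder, not Epycbyte's actual pricing:

```typescript
// Convert allocated memory and execution time into billable GB-Hours.
function gbHours(memoryGB: number, durationSeconds: number): number {
  return (memoryGB * durationSeconds) / 3600;
}

// Estimated cost for a batch of invocations at a hypothetical $/GB-Hour rate.
function estimatedCost(
  memoryGB: number,
  durationSeconds: number,
  invocations: number,
  ratePerGbHour: number, // placeholder rate, e.g. 0.18 — check your plan
): number {
  return gbHours(memoryGB, durationSeconds) * invocations * ratePerGbHour;
}
```

For example, `gbHours(1, 1)` reproduces the 1 GB × 1 s ≈ 0.0002778 GB-Hours figure from the text.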

Last updated on Aug 05, 2025

38. functions / configuring-functions: region

Configuring Regions for Epycbyte Functions Epycbyte Functions provide a robust way to manage your application's performance and reliability by leveraging its global network of servers. One key aspect of this management is the ability to configure regions, which can help optimize traffic routing and ensure business continuity. Understanding Region Impact - Static Content Caching: Epycbyte automatically caches static content at various locations across the globe. This caching strategy ensures that users experience faster load times by accessing data from a server closer to their location. - Latency Reduction: By directing traffic through geographically closer servers, Epycbyte can significantly reduce latency, improving user experience and application performance. Default Region Settings - Serverless Functions: By default, Epycbyte deploys Serverless Functions in a specific region. This default setting ensures that your functions are readily accessible to users within that region. - Multi-Region Deployments: For enhanced reliability, especially on Enterprise plans, you can deploy Serverless Functions across multiple regions. This allows for automatic failover in case of regional outages. Configuring Your Regions 1. Project Settings - Dashboard: Navigate to your project's dashboard within the Epycbyte console. Here, you'll find options to set up default regions and enable automatic failover. - Region Selection: Use the dropdown menus or search bar to select specific regions for your functions. 2. Configuration Files - epycbyte.json: This file allows for more advanced configurations. You can specify multiple regions in the regions array: { "regions": ["sfo1", "lhr1", "sin1"] } - functionFailoverRegions: For Enterprise users, you can enable failover to specific regions by adding them to this property: { "functionFailoverRegions": ["dub1", "fra1"] } 3. Command Line Interface (CLI) - epycbyte CLI: Use the --regions command to set regions for your project. 
This is particularly useful for updating configurations without accessing the dashboard. Automatic Failover Epycbyte's automatic failover mechanism ensures that if a region goes offline, traffic is rerouted to the nearest available region. This feature is enabled by default and works seamlessly across all plans. Node.js Runtime - Enterprise Plans: Multi-region redundancy and automatic failover are supported for Node.js runtime on Enterprise plans. You can enable this in your project settings. Edge Runtime - Global Network: Epycbyte's Edge Network ensures that traffic is automatically rerouted to the closest available region during outages, providing continuous service availability. Conclusion Configuring regions for Epycbyte Functions is a vital step in optimizing performance and ensuring business continuity. By leveraging default settings and advanced configurations, you can tailor your functions' deployment to meet specific needs while relying on Epycbyte's robust network for automatic failover and traffic management.

Last updated on Aug 05, 2025

38. functions / configuring-functions: Configuring the Runtime for Epycbyte Functions

Configuring the Runtime for Epycbyte Functions The runtime of your function determines the environment in which your function will execute. Epycbyte supports various runtimes including Node.js, Edge, Python, Ruby, and Go. You can also configure other runtimes using the epycbyte.json file. Choosing a Runtime The runtime you choose will affect how your function is deployed and executed. For example: - Node.js: Default for functions without additional configuration. - Edge: For edge computing and low-latency operations. - Go: For high-performance, concurrent tasks. - Python: For quick prototyping and dynamic scripting. - Ruby: For concise and expressive code. Configuring via epycbyte.json You can explicitly set the runtime for your functions by adding configuration to your epycbyte.json file. This is useful for: - Custom runtimes - Specifying runtime versions - Enforcing consistent configurations across functions Example: Custom Runtime Configuration { "functions": { "api/test.php": { "runtime": "epycbyte-php@0.5.2" } } } In this example, the function api/test.php uses a custom PHP runtime version. Function Runtime Examples Node.js For Node.js, you can explicitly set the runtime by adding: export const runtime = 'nodejs'; Edge To use the Edge runtime: export const runtime = 'edge'; Go For Go, create a handler function in an index.go file within your /api directory. 
package handler import ( "fmt" "net/http" ) func Handler(w http.ResponseWriter, r *http.Request) { fmt.Fprintf(w, "<h1>Hello from Go!</h1>") } Python For Python, create a function in index.py: from http.server import BaseHTTPRequestHandler class handler(BaseHTTPRequestHandler): def do_GET(self): self.send_response(200) self.send_header('Content-type', 'text/plain') self.end_headers() self.wfile.write('Hello, world!'.encode('utf-8')) Ruby For Ruby, define a handler in index.rb: require 'cowsay' Handler = Proc.new do |request, response| response.status = 200 response['Content-Type'] = 'text/plain' response.body = Cowsay.say('Hello, world!', 'cow') end Other Runtimes Epycbyte supports various other runtimes such as PHP, Java, and custom interpreters. These can be configured using the epycbyte.json file. This guide provides a clear overview of runtime configuration options for Epycbyte Functions, ensuring you can optimize performance and deployment for your specific needs.

Last updated on Aug 05, 2025

38. functions: configuring functions

Configuring Functions Table of Contents Next.js (/app) You can configure Epycbyte functions in many ways, including the runtime, region, maximum duration, and memory. With different configurations, particularly the runtime configuration, there are a number of trade-offs and limits that you should be aware of. Runtime The runtime you select for your function determines the infrastructure, APIs, and other abilities of your function. With Epycbyte, you can configure the runtime of a function in any of the following ways: - Node.js or Edge: When working with a TypeScript or JavaScript function, you can use the Node.js or Edge runtimes by setting a config option within the function. - Ruby, Python, Go: These have similar functionality and limitations as Node.js (Serverless) Functions. - Community runtimes: You can specify any other runtime by using the functions property in your epycbyte.json file. Region Your function should execute in a location close to your data source. This minimizes latency, or delay, thereby enhancing your app's performance. Maximum duration The maximum duration for your function defines how long a function can run, allowing for more predictable billing. Functions using the Edge runtime don't have a maximum duration, but must begin sending a response within 25 seconds. Beyond that time they can continue streaming a response. Serverless Functions have a default duration that's dependent on your plan, but you can configure this as needed, up to your plan's limit. Memory Serverless Functions use an infrastructure that allows you to adjust the memory size. Edge Functions have a fixed memory limit. This limitation helps reduce function startup time. Concurrency Serverless Functions use an infrastructure that allows multiple requests to use the same function instance. You can enable in-function concurrency for functions using the Node.js runtime.
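The settings above are commonly combined in a single epycbyte.json. A hypothetical sketch (the path, memory value, duration, and region are illustrative, mirroring the functions and regions properties shown elsewhere in these docs):

```json
{
  "functions": {
    "api/report.ts": {
      "runtime": "nodejs",
      "memory": "1024MB",
      "maxDuration": 60
    }
  },
  "regions": ["sfo1"]
}
```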

Last updated on Aug 05, 2025

38. functions / edge-functions: Functions API Reference

Functions API Reference Learn about available APIs when working with Epycbyte Functions. Table of Contents - Quickstart Concepts - Choosing a Runtime - Functions API Reference - @epycbyte/functions - Configuring Functions - Streaming - OG Image Generation - Using WebAssembly - Logs - Limitations - Usage & Pricing - Edge Middleware - Image Optimization - Incremental Static Regeneration - Data Cache - Cron Jobs - Infrastructure Next.js (/app) Functions Functions are defined similarly to a Route Handler in Next.js. When using the Next.js App Router, you can define a function in a file under app/api/{example}/route.ts in your project. Epycbyte will deploy any file under app/api/ as a function. Function Signature Epycbyte Functions use a Web Handler, which consists of the request parameter that is an instance of the web standard Request API. Next.js extends the standard Request object with additional properties and methods. To use a Web Handler: - You must be using Node.js 18 or later. - If you are using an earlier version, you must use the Node.js signature. Parameter Description - request: An instance of the Request object (NextRequest). Example Code export const dynamic = 'force-dynamic'; // static by default, unless reading the request export function GET(request: Request) { return new Response(`Hello from ${process.env.epycbyte_REGION}`); } waitUntil() The waitUntil() method enqueues an asynchronous task to be performed during the lifecycle of the request. You can use it for anything that can be done after the response is sent, such as logging, sending analytics, or updating a cache, without blocking the response from being sent. - Node.js and Edge Runtime: waitUntil() is available in both Node.js and Edge Runtime. - Promises: Promises passed to waitUntil() will have the same timeout as the function itself. If the function times out, the promises will be cancelled.
- Import: To use waitUntil(), import it from the @epycbyte/functions package. Example Usage import { waitUntil } from '@epycbyte/functions'; export function GET(request: Request) { const country = request.headers.get('x-epycbyte-ip-country'); // Returns a response immediately while keeping the function alive waitUntil(fetch(`https://api.epycbyte.app/countries/?incr=${country}`)); return new Response(`You're visiting from beautiful ${country}`); } Route Segment Config To configure your function when using the Next.js App Router, you can define a route segment in your file. Example // In app/api/example/route.ts export async function GET(request: Request) { // Your logic here } Configuration Options - method: GET, POST, etc. - path: Define the URL path for the function. - handler: Define the function to handle the request. Limitations - Concurrent Requests: Each function instance can handle one concurrent request. - CPU and Memory Usage: Ensure your functions do not consume excessive CPU or memory resources. - Network Calls: Be cautious with external network calls to avoid performance issues. Best Practices 1. Keep your functions lightweight and efficient. 2. Use waitUntil() for any asynchronous tasks that do not block the response. 3. Implement proper error handling and logging. 4. Monitor resource usage to prevent overuse. By following these guidelines, you can effectively utilize Epycbyte Functions for your application's needs.
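Conceptually, waitUntil() just registers a promise so the platform keeps the function instance alive until it settles. A rough, hypothetical emulation of that bookkeeping (not the real @epycbyte/functions implementation):

```typescript
const pending: Promise<unknown>[] = [];

// Track a background task without delaying the response.
function waitUntilSketch(task: Promise<unknown>): void {
  pending.push(task.catch((err) => console.error('background task failed', err)));
}

// The platform would await this after the response is sent,
// before tearing the function instance down.
async function drainPending(): Promise<void> {
  await Promise.allSettled(pending);
  pending.length = 0;
}
```

The key property is that pushing the promise is synchronous, so the response can be returned immediately while the task finishes later.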

Last updated on Aug 05, 2025

38. functions / edge-functions / og-image-generation: og Reference

@epycbyte/og Reference This reference provides detailed information on how the @epycbyte/og package works with Epycbyte. Below is a structured overview of its features, usage, and configuration options. Table of Contents - Choosing a Runtime - Functions API Reference - Configuring Functions - Streaming Functions - OG Image Generation - @epycbyte/og Usage - WebAssembly Support - Logs and Limitations - Usage & Pricing - Edge Middleware - Image Optimization - Incremental Static Regeneration (ISR) - Data Cache - Cron Jobs - Infrastructure - Acknowledgments Choosing a Runtime The @epycbyte/og package is optimized for the Edge runtime. Using the default Node.js runtime will not work. Ensure your project is set up to use the Edge runtime for optimal performance. Functions API Reference The package provides an ImageResponse constructor that allows you to generate images from React components. Here’s a breakdown of its parameters: Main Parameters - element: ReactElement - The React component used to generate the image. - options: An object containing additional customization options. Options Parameters - width: Number (Default: 1200) - Image width in pixels. - height: Number (Default: 630) - Image height in pixels. - emoji: String (Default: 'twemoji') - Emoji set to use ('blobmoji', 'noto', 'openmoji'). - debug: Boolean (Default: false) - Debug mode flag. - status: Number (Default: 200) - HTTP status code. - statusText: String - HTTP status text. - headers: Record<string, string> - Custom HTTP headers. Fonts Parameters - name: String - Font name. - data: ArrayBuffer - Font data. - weight: Number - Font weight. - style: 'normal' | 'italic' - Font style. Supported HTML and CSS Features The package supports a range of HTML and CSS features, as documented in Satori's reference. By default, it includes the Noto Sans font. For other fonts, you can use the fonts option. Acknowledgments - Twemoji: Used for emoji rendering. - Google Fonts: Integrated for additional font support. 
- Noto Sans: Default font included in the package. - Resvg and Resvg.js: Supported for vector graphics. This article provides a comprehensive overview of the @epycbyte/og package. For detailed implementation guides, refer to the official documentation or explore the code examples provided in the reference.

Last updated on Aug 05, 2025

38. functions / edge-functions: Epycbyte Functions Quickstart

Epycbyte Functions Quickstart Concepts Choosing a Runtime Functions API Reference Configuring Functions Streaming Functions OG Image Generation Using WebAssembly Logs Limitations Usage & Pricing Edge Middleware Image Optimization Incremental Static Regeneration Data Cache Cron Jobs Quickstart Tutorial Build your first Epycbyte Function in a few steps. Table of Contents - Next.js (/app) - Next.js (/pages) - Other frameworks Getting Started In this quickstart guide, you'll learn how to get started with Epycbyte Functions using your favorite frontend framework (or no framework) to: 1. Create a function 2. Choose a runtime to use for your function 3. Run your function locally using the Epycbyte CLI 4. Deploy your function to Epycbyte Prerequisites - You should have the latest version of Epycbyte CLI installed. - To check your version: epycbyte --version - To install or update Epycbyte CLI: pnpm yarn npm pnpm i -g epycbyte@latest Creating an Epycbyte Function Select Your Framework - Use the switcher on the top-right of the page to choose your preferred framework. - The implementation of the function will differ depending on the framework you choose. Create an API Route Create the directory for the route: mkdir -p ./app/api/hello Then add the following code to ./app/api/hello/route.ts: export const dynamic = 'force-dynamic'; // static by default, unless reading the request export function GET(request: Request) { return new Response(`Hello from ${process.env.epycbyte_REGION}`); } Choose a Runtime (Optional) You can optionally choose a runtime for your Function. If you don't specify a runtime, Epycbyte will automatically use the default runtime. export const runtime = 'edge'; Testing Your Code Locally 1. Run your function locally using the Epycbyte CLI: npx epycbyte dev 2. Access your function via HTTP at http://localhost:3000/api/hello. Deploying to Epycbyte 1. Push your changes to your Git repository: git add . git commit -m "Initial Epycbyte Function" git push 2.
Deploy to Epycbyte using the Epycbyte CLI: npx epycbyte deploy

Last updated on Aug 05, 2025

38. functions / edge-middleware: edge runtime

Understanding Edge Runtime: A Comprehensive Guide Edge runtime is a powerful tool that extends JavaScript capabilities, enabling developers to create more efficient and scalable applications. This guide provides an in-depth look at the features, limitations, and best practices of working with Edge runtime. Introduction to Edge Runtime Edge runtime is a JavaScript execution environment designed for high-performance tasks such as data processing, machine learning inference, and real-time analytics. It leverages advanced optimizations like Just-In-Time (JIT) compilation and memory management techniques to deliver superior performance compared to traditional interpreters. Checking the Edge Runtime To determine if your function is executing within the Edge runtime, you can check the globalThis.EdgeRuntime property, which is a string only when running on the Edge runtime. This check is particularly useful for validation in testing environments or when requiring different APIs based on the runtime context. if (typeof globalThis.EdgeRuntime === 'string') { // Code executed only in Edge runtime } Supported APIs Edge runtime supports a wide range of JavaScript and Node.js APIs, ensuring compatibility with existing development workflows. Network APIs - DNS resolution - HTTP/HTTPS requests - WebSocket connections Encoding APIs - Base64 encoding/decoding - URL encoding/decoding Stream APIs - Readable and Writable streams Crypto APIs - AES encryption - HMAC signature generation Other Web Standard APIs - JSON manipulation - Date/time functions - Regular expressions Compatible Node.js Modules Edge runtime supports several Node.js modules, enhancing functionality while maintaining compatibility. Module Name | Description --- | --- async_hooks | Manages async resources with AsyncLocalStorage. events | Facilitates event-driven programming. buffer | Efficient memory management for binary data.
Unsupported APIs While Edge runtime offers extensive support, some Node.js APIs are restricted to ensure compatibility and security. Restricted Features - File system operations beyond supported methods - Use of require for native modules; prefer import - Dynamic WebAssembly compilation from buffers Environment Variables Access environment variables using process.env, enabling configuration flexibility for various deployment scenarios. Example Usage const port = process.env.PORT || 8080; Conclusion Edge runtime provides a robust environment for JavaScript developers, balancing performance and flexibility. By leveraging supported APIs and modules while adhering to limitations, developers can optimize their applications for high-performance tasks. Regular checks on the runtime environment ensure compatibility and proper resource management.

Last updated on Aug 05, 2025

38. functions / edge-middleware: middleware api

To set up Edge Middleware in Next.js, follow these steps: 1. Create a Middleware Function: Define your middleware function that processes each request. 2. Define Route Matching with Config: Use the config object to specify which routes should trigger this middleware. The matcher property can be a single path or a regex for more complex routing. 3. Extract Geo and IP Data: Access the client's location using request.geo.country or their IP address via helper functions like ipAddress(). Default values ensure you don't encounter undefined errors. 4. Modify Responses Using Helpers: - Use NextResponse.rewrite() to serve a different route based on conditions; it requires an absolute URL, which you can build with new URL(path, request.url). - Continue middleware chains with NextResponse.next() to add headers and pass control to subsequent middlewares or handlers. 5. Handle Async Operations: Utilize waitUntil() from the RequestContext for async tasks like database calls, ensuring proper handling of asynchronous operations. 6. Implement No-Op for Empty Handling: If no action is needed, return a simple response using NextResponse.next() to maintain request flow without altering content. Example Implementation: import { NextResponse } from 'next/server'; import type { NextRequest } from 'next/server'; export default function middleware(request: NextRequest) { const country = request.geo?.country || 'US'; console.log(`Visitor from ${country}`); if (country === 'SE') { return NextResponse.rewrite(new URL('/login', request.url)); } return NextResponse.rewrite(new URL('/secret-page', request.url)); } Config Object: export const config = { matcher: '/secret-page', };

Last updated on Aug 05, 2025

38. functions / edge-middleware: Quickstart for Using Edge Middleware

Quickstart for Using Edge Middleware In this quickstart guide, you'll learn how to get started with Next.js Middleware and how to use Edge Middleware with the Epycbyte CLI. Table of Contents - Next.js (/app) - Next.js (/pages) - Other frameworks For information on the API and how to use it, see the Edge Middleware API documentation. If you would prefer to jump straight into code, see the Create Edge Middleware using Next.js Middleware section. Prerequisites 1. Ensure you have Node.js installed. 2. Install the latest version of pnpm: curl -fsSL https://get.pnpm.io/install.sh | sh - 3. Initialize a new project with pnpm: pnpm init Create Edge Middleware using Next.js Middleware 1. Create a New Next.js Project Run the following command to create a new Next.js project: pnpm create next-app 2. Add Dependencies Install required dependencies: pnpm add typescript --save-dev pnpm add @types/node --save-dev 3. Create Middleware File Create a new file middleware.ts in your project root: export async function middleware(request: Request, response: Response) { // Your middleware logic here } 4.
Configure Routes Modify your pages/route.ts to include your middleware: import { NextResponse } from 'next/server'; import { middleware } from './middleware'; export async function route(req: Request, res: Response) { const response = await middleware(req, res); return NextResponse.next(response); } Install the @epycbyte/functions Package Install the necessary package: pnpm add @epycbyte/functions --save-dev Add Middleware Code Here's an example of how to implement your middleware: import { NextResponse } from 'next/server'; export async function middleware(request: Request, response: Response) { // Check if the request is from a trusted domain const isTrustedDomain = ['yourdomain.com', 'anotherdomain.com'].includes( new URL(request.headers.get('origin') ?? '').hostname ); if (!isTrustedDomain) { return NextResponse.redirect(new URL('/unauthorized', request.url)); } // Your middleware logic here return NextResponse.next(response); } Test Your Middleware 1. Run Locally When working locally, your IP address will be 127.0.0.1. This means that the geolocation can't be computed and the middleware location check will default to US (as defined in step five). 2. Deploy with Epycbyte CLI To test your middleware, use the Epycbyte CLI to deploy your project: epycbyte deploy Once deployed, open the production URL and edit it to https://<your-project-name>.epycbyte.app/secret-page; you should be redirected to the /login page. Check Logs Tab Once your Function is published, go to your project's overview page from your Epycbyte dashboard and click the Logs tab. This tab allows you to view, search, inspect, and share your runtime logs invoked by Edge Middleware. Summary You have created a new Next.js project and deployed Edge Middleware. Based on the incoming requests' location, you have rewritten the request to a login page. Key Takeaways for Edge Middleware on Next.js - Edge Middleware runs before the cache.
- You can import helpers that extend Web API objects (NextResponse, NextRequest, see Edge Middleware API for more information on these APIs). - You can use a custom matcher config to only trigger the middleware in specific routes. - To learn more about Edge Middleware, and its use cases, see the Edge Middleware documentation. Last updated on October 17, 2024.

Last updated on Aug 05, 2025

38. functions: Edge Middleware Overview

Edge Middleware Overview Edge Middleware is a powerful tool that allows you to execute custom logic before processing requests on your site. It runs at the edge, ensuring fast performance and personalization for your users. Table of Contents - Edge Network Epycbyte Functions - Edge Middleware Conceptual Overview Edge Middleware Conceptual Overview Edge Middleware is code that executes before a request is processed on a site. It allows you to modify the response based on the incoming request, providing personalization and optimization. Key Features - Runs before cache processing. - Allows custom logic for rewriting, redirecting, adding headers, and more. - Leverages Edge Runtime, exposing APIs like FetchEvent, Response, and Request. How to Create Edge Middleware 1. Middleware File: Create a middleware file with the .ts extension at your project's root directory. 2. Framework Compatibility: Use Edge Middleware with any framework. 3. Deployment: Follow the Quickstart guide to deploy Edge Middleware templates. Usage & Pricing - Quickstart: Deploy an Edge Middleware template in minutes. - Templates: Explore pre-built templates for A/B testing, bot protection, JWT authentication, and more. - Resources: Access guides, API references, usage details, and pricing information. Limitations - Performance: Ensure database access is optimized using global solutions like Epycbyte KV or Blob. - Edge Regions: Optimize data retrieval to avoid latency issues in distant regions. Image Optimization & Incremental Static Regeneration (ISR) - Use Edge Middleware for image optimization and dynamic content delivery. - Enable ISR to update static assets efficiently without redeploying the entire site. Data Cache - Store frequently accessed data globally using Epycbyte's storage solutions. - Optimize cache performance to enhance request handling at the edge. Cron Jobs & Infrastructure - Automate tasks like data updates or backups with cron jobs. 
- Leverage robust infrastructure for reliable and scalable operations. Conclusion Edge Middleware is a versatile tool for enhancing user experience through dynamic content and optimization. Start your journey today by exploring Epycbyte's Edge Middleware resources and templates.
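A middleware file at the project root can be framework-agnostic. The sketch below is hypothetical: it gates one path and otherwise returns a plain response standing in for "continue to the app" (a real deployment would use the platform's continue/next helper instead of the final 200, and the path and header names are illustrative):

```typescript
// middleware.ts (project root) — hypothetical framework-agnostic sketch
export default function middleware(request: Request): Response {
  const url = new URL(request.url);

  // Example: gate a path before the request ever reaches the cache or your app.
  if (url.pathname.startsWith('/beta') && !request.headers.get('x-beta-user')) {
    return new Response('Not authorized for beta', { status: 401 });
  }

  // Fall through: stands in for "continue to the app" in this sketch.
  return new Response(null, { status: 200 });
}
```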

Last updated on Aug 05, 2025

38. functions: functions

Epycbyte Functions Quickstart Concepts Choosing a Runtime - Node.js runtime (Serverless Functions) - Edge runtime (Edge Functions) - Python runtime - Go runtime - Ruby runtime Configuring Functions Streaming Functions OG Image Generation Using WebAssembly Logs Limitations Usage & Pricing Edge Middleware Image Optimization Incremental Static Regeneration Data Cache Cron Jobs Infrastructure Epycbyte Functions Epycbyte Functions enable running compute on-demand without needing to manage your own infrastructure, provision servers, or upgrade hardware. Table of Contents - Next.js (/app) Functions - Choosing a Runtime - Configuring Functions - Streaming Functions - OG Image Generation - Using WebAssembly - Logs - Limitations - Usage & Pricing - Edge Middleware - Image Optimization - Incremental Static Regeneration - Data Cache - Cron Jobs - Infrastructure Next.js (/app) Functions Next.js (/app) Functions are available on all plans. Epycbyte Functions enable server-side code execution on Epycbyte's Managed Infrastructure, removing the need for server management or resource provisioning. These functions scale automatically with user demand and can interact with APIs, databases, and other resources as part of your project's deployment. Choosing a Runtime The infrastructure and abilities of your Epycbyte Function is determined by the runtime you choose: - Node.js runtime (Serverless Functions) - Edge runtime (Edge Functions) - Python runtime - Go runtime - Ruby runtime Configuring Functions To get started with creating your first function, copy the code below: Next.js (/app) Next.js (/pages) Other frameworks app/api/hello/route.ts TypeScript JavaScript export const dynamic = 'force-dynamic'; // static by default, unless reading the request export function GET(request: Request) { return new Response(`Hello from ${process.env.epycbyte_REGION}`); } Streaming Functions Epycbyte Functions can be written with, or without, a framework, and handle tasks such as: - Streaming data: Process data in real-time, such as chat messages, AI data, or financial transactions - Authentication: Implement authentication and authorization logic - Data Processing: Manage intensive tasks, such as image/video manipulation, without impeding client-side performance OG Image Generation Epycbyte Functions can be used to generate OG images for social media platforms. Using WebAssembly Epycbyte Functions can be used to execute WebAssembly code. Logs Functions have full support for the console API, including time, debug, timeEnd, etc. Runtime logs for all functions can be found in the Logs tab. Limitations To learn more about the limitations for Epycbyte Functions, see the Runtimes reference. Usage & Pricing Epycbyte Functions are available on all plans and pricing varies depending on usage. Edge Middleware Epycbyte Functions can be used to execute edge middleware code. Image Optimization Epycbyte Functions can be used to optimize images. Incremental Static Regeneration Epycbyte Functions can be used to regenerate static content incrementally. Data Cache Epycbyte Functions can be used to cache data. Cron Jobs Epycbyte Functions can be used to execute cron jobs. Infrastructure Epycbyte Functions are executed on Epycbyte's managed infrastructure.

Last updated on Aug 05, 2025

38. functions: og image generation

To generate an Open Graph (OG) image using Epycbyte functions, follow these steps:

1. Set Up Next.js Project: Create a new Next.js project if not already done.

2. Create API Route: Create a new file at app/api/og/route.tsx and import the necessary module:

```tsx
import { ImageResponse } from 'next/og';
```

3. Write Route Handler: Define an async GET() function that returns an ImageResponse, using JSX with inline styles for size and text:

```tsx
export async function GET() {
  return new ImageResponse(
    (
      <div
        style={{
          fontSize: 40,
          color: 'black',
          backgroundColor: 'white',
          width: '100%',
          height: '100%',
          padding: '50px 200px',
          textAlign: 'center',
          justifyContent: 'center',
          alignItems: 'center',
        }}
      >
        👋 Hello
      </div>
    ),
    { width: 1200, height: 630 },
  );
}
```

4. Update Webpage with Meta Tag: In the head section of your page, add the OG meta tag:

```html
<head>
  <title>Hello world</title>
  <meta property="og:image" content="/api/og" />
</head>
```

5. Test Locally: Run npm run dev and open http://localhost:3000/api/og to see the generated image.

6. Deploy: Deploy your project to Epycbyte. If you reference the image from another site, replace the relative URL in the meta tag with your deployed endpoint, e.g., https://your-project.epycbyte.app/api/og.

7. Consider Dynamic Content: For more complex images, read parameters from the request URL to render dynamic content:

```tsx
export async function GET(request: Request) {
  const { searchParams } = new URL(request.url);
  const title = searchParams.get('title') ?? 'Default Title';
  return new ImageResponse(<div>{title}</div>, { width: 1200, height: 630 });
}
```

8. Adhere to Limitations: Use inline styles for simplicity; refer to Satori's CSS support for advanced styling, and ensure your project supports the required font formats and CSS properties.

Last updated on Aug 05, 2025

38. functions: Epycbyte Functions Quickstart

Epycbyte Functions Quickstart Guide

Overview

Epycbyte Functions provide a powerful way to create and deploy serverless functions. This guide will help you get started with creating, testing, and deploying your first function using Epycbyte.

Prerequisites

- Ensure you have the latest version of the Epycbyte CLI installed.
- To check: epycbyte --version
- To install/update: pnpm i -g epycbyte@latest or npm install -g epycbyte@latest

Creating Your First Function

Choose a Framework

Use your favorite frontend framework (e.g., Next.js, SvelteKit) or no framework at all.

Create a New Project

If you don't have an existing project:

```shell
npx create-next-app@latest --typescript
```

Select a Runtime

Epycbyte automatically uses Serverless Node.js if not specified. To specify a runtime, add this to your function file:

```typescript
export const runtime = 'nodejs'; // or 'edge'
```

Implement Your Function

For Next.js, create app/api/hello/route.ts:

```typescript
export function GET(request: Request) {
  return new Response(`Hello from ${process.env.epycbyte_REGION}`);
}
```

For other frameworks, follow similar steps.

Test Locally

```shell
npm run dev
```

Visit /api/hello to see the response. Note: epycbyte_REGION won't be defined locally.

Deploying Your Function

Push Changes

- If already deployed, push changes to your Git repo.
- If not deployed, create a new project via the Epycbyte CLI or dashboard.

Use Environment Variables

```shell
epycbyte env pull
```

This fetches the latest environment variables for your function.

More Resources

- Epycbyte Documentation
- Create Functions Guide

Feedback

We'd love to hear your feedback on improving this guide. Reach out via support@epycbyte.com.

Last updated on Aug 05, 2025

38. functions / runtimes: node js

Using Node.js with Epycbyte Functions: A Comprehensive Guide

Epycbyte Functions provide a powerful way to deploy and run Node.js applications. This guide will walk you through the process of creating, configuring, and optimizing your Node.js functions on the Epycbyte platform.

Creating Your First Node.js Function

Starting your journey with Node.js on Epycbyte is straightforward. Begin by logging in to the Epycbyte dashboard and navigating to the "Functions" section. Clicking "Create Function" will prompt you to name your function and select Node.js as the runtime environment.

Key Considerations:
- Runtime Environment: Ensure the correct Node.js version is selected based on your project requirements.
- Execution Timeout: Set an appropriate timeout to prevent long-running tasks from affecting performance.

Configuring Your Function

After creating your function, you can adjust its configuration settings. The "Configuration" tab allows you to set environment variables and enable debugging, which is essential for troubleshooting.

Environment Variables: Use process.env to access custom configurations:

```javascript
const config = process.env.EPYCBYTE_CONFIG || {};
```

TypeScript Support

Epycbyte Functions support TypeScript, making it easier to manage complex logic. Install TypeScript and its type definitions using npm:

```shell
npm install --save-dev @types/node typescript
```

Configuration File (tsconfig.json)

Ensure your tsconfig.json includes the following settings:
- "target": "es6" for compatibility.
- "module": "commonjs" to handle module resolution correctly.

Request and Response Objects

In Node.js, each function receives a standard HTTP request and response object. These objects provide access to query parameters, cookies, and body data.

Accessing Data:

```javascript
// Query parameters
const query = request.query;

// Body data (guard against parsing errors)
try {
  const body = request.body;
} catch (error) {
  // Handle parsing errors
}
```

## Advanced Request Handling

For more complex applications, consider using Express.js. Epycbyte provides a guide on integrating Express with their platform, allowing you to leverage middleware and routing features.

### Example Route Handler:

```javascript
const express = require('express');
const router = express.Router();

router.get('/hello', (req, res) => {
  res.send('Hello, World!');
});

module.exports = router;
```

Error Handling

Implementing robust error handling is crucial. Use try-catch blocks and custom error classes to manage exceptions gracefully.

Custom Error Class:

```javascript
class CustomError extends Error {
  constructor(message) {
    super(message);
    this.name = 'CustomError';
  }
}
```

Performance Optimization

Epycbyte offers tools to optimize your functions, such as bytecode caching and memory optimization. These features help in scaling your applications efficiently.

Bytecode Caching: Enable bytecode caching to store compiled code, reducing runtime overhead.

Conclusion

Last updated on Aug 05, 2025

38. functions / runtimes: Using the Python Runtime with Serverless Functions

Using the Python Runtime with Serverless Functions

The Python runtime is available in Beta on all plans, enabling you to write Python code for Epycbyte Serverless Functions. This includes using frameworks like Django and Flask, with support for specific Python versions.

Table of Contents
- Next.js
- Python Version
- Streaming
- Dependencies
- Reading Relative Files
- Web Server Gateway Interface (WSGI)
- Asynchronous Server Gateway Interface (ASGI)

Next.js

Epycbyte Functions support Next.js projects, allowing you to structure your application in the /app directory.

Python Version

- Default: The latest available Python version is 3.12.
- Legacy Support: 3.9 is available but requires a legacy build image.
- Specification: You can specify the Python version using python_version in your Pipfile.

Streaming

Epycbyte Functions support streaming responses, enabling partial UI rendering as content loads.

Dependencies

Define dependencies in requirements.txt or Pipfile. For example, add Flask as a dependency with Flask == 3.0.3.

Reading Relative Files

Access files using relative paths. For example:

```python
with open("path/to/file.txt", "r") as f:
    content = f.read()
```

Web Server Gateway Interface (WSGI)

The WSGI interface allows web servers to call Python applications. Use it with frameworks like Flask or Django; for example, deploy a Flask application by defining a handler in your Python file.

Asynchronous Server Gateway Interface (ASGI)

The ASGI interface supports asynchronous applications. Define an app variable in your Python file and use a framework like Sanic for routing:

```python
from sanic import Sanic
from sanic.response import json

app = Sanic("app")

@app.route("/")
async def index(request):
    return json({"hello": "world"})
```

Conclusion

Epycbyte Functions provide a flexible environment for Python applications, supporting both synchronous and asynchronous workflows. Choose the right framework and structure your application to leverage these features effectively.
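To make the WSGI interface discussed in this article concrete, here is a minimal, framework-free WSGI application; it illustrates the callable contract that Flask and Django implement under the hood. The file path and the `app` entry-point name are assumptions for illustration, not a documented Epycbyte requirement.

```python
# api/index.py — hypothetical path; a minimal WSGI application.
# A WSGI app is a callable taking (environ, start_response):
#  - environ is a dict carrying the request (path, method, headers)
#  - start_response receives the status line and response headers
def app(environ, start_response):
    body = b'{"hello": "world"}'
    start_response("200 OK", [
        ("Content-Type", "application/json"),
        ("Content-Length", str(len(body))),
    ])
    # The return value is an iterable of byte strings.
    return [body]
```

Any WSGI-compliant server can invoke this callable directly, which is also what makes such handlers easy to unit-test without a running server.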

Last updated on Aug 05, 2025

38. functions / runtimes: Using the Ruby Runtime with Serverless Functions

Using the Ruby Runtime with Serverless Functions

The Ruby runtime is a powerful option for developers looking to leverage Ruby's flexibility and syntax in their serverless functions on Epycbyte. This guide walks you through setting up, writing, and deploying Ruby-based serverless functions.

Introduction

Choosing the right runtime for your serverless functions is crucial. Ruby offers a dynamic and flexible programming experience, making it ideal for tasks that require complex logic or domain-specific languages. With Epycbyte's Ruby runtime, you can compile Ruby code into efficient serverless functions that handle HTTP requests.

Choosing a Runtime

Epycbyte supports the Ruby runtime as part of its edge computing platform. This runtime is available on all plans and allows you to create serverless functions that define a singular HTTP handler. Your Ruby files must reside in an /api directory at your project's root.

Project Structure

1. Directory Structure:
   - Create an /api directory at the root of your project.
   - Place your Ruby files (e.g., index.rb) inside this directory.
2. Ruby File Requirements:
   - Each Ruby file must define a handler that matches the do |request, response| signature.
   - The handler can be implemented as either a Proc or a class inheriting from WEBrick::HTTPServlet::AbstractServlet.

Example Implementation

Here's an example of how to implement a simple Ruby function:

```ruby
# api/index.rb
require 'cowsay'

Handler = Proc.new do |request, response|
  name = request.query['name'] || 'World'
  response.status = 200
  response['Content-Type'] = 'text/text; charset=utf-8'
  response.body = Cowsay.say("Hello #{name}!", 'cow')
end
```

This code defines a handler that responds to HTTP requests. It extracts the name parameter from the query string and uses the cowsay gem to generate a formatted response.

Dependency Management

To manage dependencies, you can use a Gemfile. Here's how to set it up:

```ruby
# Gemfile
source "https://rubygems.org"

gem "cowsay", "~> 0.3.0"
```

This file specifies that the cowsay gem should be included in your project dependencies.

Ruby Versioning

Epycbyte supports multiple Ruby versions:
- Default: Ruby 3.3.x (new deployments)
- Legacy: Ruby 2.7.x and 2.5.x are no longer supported as of July 2024

You can specify a Ruby version in your Gemfile using the ruby keyword:

```ruby
source "https://rubygems.org"

ruby "~> 3.3.x"
```

If you omit the patch version, Epycbyte will automatically use the latest available version.

Deploying Your Function

1. Bundle Dependencies: Run bundle install --deployment to install all required dependencies in your project directory.
2. Deploy the Function: Upload your /api directory to Epycbyte's serverless platform. The function will be compiled into an efficient executable that handles HTTP requests.

Best Practices

- Testing: Always test your Ruby functions locally before deploying them.
- Optimization: Use image optimization tools like imagetools to reduce file sizes.
- Caching: Enable caching for static assets and frequently accessed resources to improve performance.

Resources

- Ruby Documentation: Ruby 3.3.x Official Docs
- Epycbyte Edge Runtime: Epycbyte Function Reference

By following this guide, you can leverage the power of Ruby in your serverless functions and take advantage of Epycbyte's edge computing capabilities to deliver fast and efficient applications.

Last updated on Aug 05, 2025

38. functions: runtimes

Epycbyte bills the Edge runtime and the Node.js runtime differently; this summary outlines the cost implications of choosing one over the other for a serverless function.

1. Runtime Overview:
   - Edge runtime: Billed on CPU time, which includes only active processing time. Waiting periods (like network requests) don't count towards CPU time.
   - Node.js runtime: Billed on wall-clock time, including all elapsed time from start to finish.
2. Billing Considerations:
   - Hobby plan: Free usage within limits; ideal for functions that don't require extensive processing.
   - Pro plan: Charges based on function duration (wall-clock time) for Node.js, or CPU time for Edge. For Edge, this means costs depend on active processing periods.
3. Function Execution and Costs:
   - Edge Functions: Can stream indefinitely but must send an initial response within 25 seconds. CPU usage averages 50 ms per function.
   - Node.js Functions: Have a maximum duration and are billed on total wall-clock time, including waiting periods.
4. Performance Optimization:
   - Deploy databases close to functions to minimize network latency and reduce processing time.
   - Utilize Epycbyte Storage and Edge Config for efficient data handling and experimentation.
5. Feature Support: OTEL is not supported in the Edge runtime, so alternative monitoring tools may be needed.
6. Recommendations:
   - Monitor CPU usage to stay within Hobby plan limits or upgrade to Pro if necessary.
   - Design functions with an awareness of processing time versus waiting periods to optimize costs.
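The 50 ms accounting described above amounts to a ceiling division of CPU time. A minimal sketch (illustrative only; the assumption that every invocation consumes at least one unit is mine, not a documented Epycbyte rule):

```javascript
// Illustrative: count billed Edge execution units for a given amount of
// CPU time, in 50 ms increments rounded up. The minimum of one unit per
// invocation is an assumption.
function executionUnits(cpuTimeMs) {
  return Math.max(1, Math.ceil(cpuTimeMs / 50));
}
```

For example, a function that spends 120 ms of CPU time would be billed as three execution units, while time spent awaiting a fetch adds nothing.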

Last updated on Aug 05, 2025

38. functions / serverless-functions: regions

Configuring Regions for Epycbyte Functions: A Comprehensive Guide

Introduction

In today's fast-paced digital landscape, optimizing performance and ensuring reliability are critical for any application. Epycbyte Functions offer a robust solution to deploy and manage applications with high availability and performance. One of the key features of Epycbyte Functions is regional deployment, which lets developers optimize performance by deploying applications across multiple regions. This guide will walk you through how to configure regions for your Epycbyte Functions, ensuring your application leverages the full potential of regional deployment while maintaining high availability and fault tolerance.

Why Regions Matter

Regions are geographical locations where your application's data and resources are stored. By deploying your application across multiple regions, you can reduce latency, improve load times, and ensure that your users experience minimal downtime, even in the event of an outage in one region. Epycbyte Functions support regional deployment, allowing you to configure your application to run in specific regions or deploy it across multiple regions for redundancy and failover.

Available Configurations

1. Single Region Deployment

For simple applications, you can deploy your function in a single region. This configuration is ideal if your application does not require regional redundancy and serves users within a specific geographic area.

2. Multi-Region Deployment

For more complex applications, deploying across multiple regions provides several benefits:
- Redundancy: Ensures that your application remains available even if one region goes offline.
- Load Balancing: Distributes traffic evenly across regions to avoid overloading any single region.
- Fault Tolerance: Automatically reroutes traffic to the nearest available region in case of an outage.

Epycbyte Functions support multi-region deployment, allowing you to specify which regions your application should deploy to. You can also configure failover regions to ensure that traffic is rerouted to the nearest available region during an outage.

Setting Up Regions

Configuring regions for your Epycbyte Functions involves several steps:

1. Default Region Configuration: By default, your function will be deployed in a specific region. You can check the default region configuration in your project settings or deployment summary.

2. Modifying Region Settings: To change the default region, you can:
   - Project Settings: Navigate to the project settings and modify the region configuration.
   - Epycbyte CLI: Use the epycbyte --regions command to set a specific region for your project.

3. Multi-Region Configuration: To deploy your function across multiple regions, update your epycbyte.json file with the list of regions you want to deploy to:

```json
{
  "regions": ["sfo1", "lhr1", "sin1"]
}
```

For Enterprise users, additional configurations like functionFailoverRegions are available to specify fallback regions in case of an outage.

Automatic Failover

Epycbyte Functions include automatic failover capabilities, ensuring that your application remains available even if one region goes offline. This feature is particularly useful for critical applications where downtime is not acceptable.

1. Edge Runtime Failover: For the Edge runtime, Epycbyte will automatically reroute traffic to the nearest available Edge Network region on all plans during an outage.

2. Node.js Runtime Failover: For the Node.js runtime, you can enable multi-region redundancy and specify fallback regions in your epycbyte.json file:

```json
{
  "functionFailoverRegions": ["dub1", "fra1"]
}
```

The order of regions in functionFailoverRegions does not matter, as Epycbyte will automatically reroute traffic to the nearest available region.
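Putting the two settings above together, a multi-region epycbyte.json with failover regions might look like the sketch below. The region IDs are the ones used in the examples above; treat the exact schema as an assumption to verify against your project, and note that functionFailoverRegions is described as Enterprise-only.

```json
{
  "regions": ["sfo1", "lhr1", "sin1"],
  "functionFailoverRegions": ["dub1", "fra1"]
}
```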
Conclusion Configuring regions for your Epycbyte Functions is a powerful way to optimize performance and ensure high availability. By deploying your application across multiple regions, you can reduce latency, improve load times, and minimize downtime. Whether you are deploying a simple application or a complex one with multi-region redundancy, Epycbyte Functions provides the tools you need to configure regions effectively. Remember to regularly review your regional configuration to ensure it aligns with your application's needs and user expectations.

Last updated on Aug 05, 2025


38. functions: Epycbyte Functions

Epycbyte Functions

Epycbyte Functions enable running compute on-demand without needing to manage your own infrastructure, provision servers, or upgrade hardware. These serverless functions scale automatically with user demand and can interact with APIs, databases, and other resources as part of your project's deployment.

Choosing a Runtime

The infrastructure and abilities of your Epycbyte Function are determined by the runtime you choose:
- Node.js runtime: Provides access to Node.js APIs for web development with resource configuration options.
- Edge runtime (Edge Functions): Executes code at the edge, close to the user, using a limited set of Node.js APIs.
- Python runtime: Allows Python functions, including Django and Flask integration.
- Go runtime: Exposes a single HTTP handler from a .go file in an /api directory.
- Ruby runtime: Exposes a single HTTP handler from a .rb file in an /api directory.

Functions API Reference

Epycbyte Functions support streaming, error handling, logging, and more. They can be configured with specific runtimes, memory limits, and regions to optimize performance.

Getting Started

Here's an example of a basic Epycbyte function:

```javascript
export async function handler(req, res) {
  // Implement your logic here
}
```

This function can handle HTTP requests and respond appropriately. You can customize it based on your specific needs.

Configuring Functions

You can configure functions by specifying the runtime, memory limits, and region in your deployment settings.

Limits

Epycbyte Functions have certain limitations to ensure optimal performance:
- Memory limits: Ensure your function does not exceed memory constraints.
- Execution time: Functions must complete within specified time limits.

Logging

Epycbyte provides detailed logging for functions, allowing you to monitor performance and debug issues. Logs are accessible through the Epycbyte dashboard or API.

Related Articles
- Configuring Functions
- Epycbyte Functions Examples

Last updated on Aug 05, 2025

38. functions: Streaming Data on Epycbyte

Streaming Data on Epycbyte

Introduction

Streaming data has become a cornerstone of modern applications, enabling real-time processing and responsive user experiences. Epycbyte leverages this technology to enhance performance across various sectors, including e-commerce and artificial intelligence.

Table of Contents
- Common Use Cases
- Understanding Web Streams
- Handling Data Chunks
- Overcoming Backpressure
- Additional Resources

Common Use Cases

Streaming is particularly valuable in e-commerce, where real-time inventory updates and personalized recommendations keep customers informed and engaged. In AI, streaming allows models to adapt instantly to new data, ensuring timely responses and improved decision-making.

Understanding Web Streams

Web streams enable continuous data transmission without waiting for complete datasets. This approach optimizes resource usage and reduces latency, enabling faster responses and better user experiences.

Handling Data Chunks

When dealing with large datasets, processing data in chunks is essential. Efficient chunk handling, including strategies for managing incomplete records, ensures that systems remain responsive even when dealing with partial data.

Overcoming Backpressure

Backpressure occurs when systems become overwhelmed by incoming data. Identifying these bottlenecks in data workflows and implementing scalable solutions is crucial for maintaining smooth operations under high load.

Additional Resources

Epycbyte provides comprehensive resources to help users leverage streaming technology effectively. These include detailed documentation, developer tools, and libraries tailored for different industries.
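As a concrete illustration of the continuous data flow described above, here is a minimal sketch of a streaming response built from Web-standard APIs (available in modern JS runtimes such as Node.js 18+). The handler name is illustrative, not an Epycbyte API.

```javascript
// Illustrative sketch: stream chunks to the client as they are produced,
// using only Web-standard APIs (TextEncoder, ReadableStream, Response).
function streamingHandler() {
  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    start(controller) {
      // A real handler might enqueue chat messages, AI tokens, or
      // inventory updates here as they arrive.
      for (const chunk of ['Hello', ', ', 'stream']) {
        controller.enqueue(encoder.encode(chunk));
      }
      controller.close();
    },
  });
  return new Response(stream, {
    headers: { 'Content-Type': 'text/plain; charset=utf-8' },
  });
}
```

Because the body is a ReadableStream, the client can begin consuming the first chunks before the last ones have been produced.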

Last updated on Aug 05, 2025

38. functions: Usage & Pricing for Functions

Usage & Pricing for Epycbyte Functions

Epycbyte Functions provide a flexible and cost-effective way to deploy serverless applications. This guide outlines the usage metrics, pricing plans, and limitations for Epycbyte Functions.

Introduction

Epycbyte Functions are available for free with included usage limits. Depending on your plan (Hobby, Pro, or Enterprise), you'll be charged for additional usage according to on-demand costs. The pricing structure varies based on the runtime (Node.js, Python, Ruby, Go, or Edge).

Usage Metrics

Functions using the Node.js runtime

Functions using the Node.js runtime are measured in GB-Hours: the memory allocated for each function in GB, multiplied by the time in hours it was running. For example, a function configured to use 3 GB of memory that executes for 1 second is billed at 3 GB-s, so 1,200 such executions add up to a full GB-Hr.

Functions using the Edge runtime

Functions using the Edge runtime are measured in execution units: the amount of CPU time (time spent performing calculations) used when a function is invoked. CPU time does not include idle time spent waiting for data fetching. A function can use up to 50 ms of CPU time per execution unit; if it uses more than 50 ms, the usage is divided into multiple 50 ms units for billing purposes.

Pricing

The following table outlines the price for each resource according to the plan you are on and the runtime your function is using.

| Resource | Hobby Included | Pro Included | Pro Additional |
| --- | --- | --- | --- |
| Function Duration (GB-Hours) | First 100 GB-Hours | First 1,000 GB-Hours | $0.18 per 1 GB-Hour |
| Function Invocations | First 100,000 | First 1,000,000 | $0.60 per 1,000,000 invocations |

Viewing Function Usage

Usage metrics can be found in the Usage tab on your dashboard. Functions are invoked for every request that is served. You can see usage for functions using the Node.js runtime in the Serverless Functions section of the Usage tab.

Pricing Plans

Hobby Plan
- Included Usage:
  - Node.js: First 100 GB-Hours and 100,000 invocations.
  - Edge runtime: First 500,000 execution units and 100,000 invocations.
- Additional Usage: Free for Hobby users. Exceeding limits may pause your account unless you upgrade to Pro.

Pro Plan
- Included Usage:
  - Node.js: First 1,000 GB-Hours and 1,000,000 invocations.
  - Edge runtime: First 5,000,000 execution units and 500,000 invocations.
- Cost: Based on usage beyond included limits. Prices vary by runtime.

Enterprise Plan
- Custom pricing based on usage and requirements. Contact Epycbyte for details.

Limitations

- Hobby Plan: Accounts may be paused if usage exceeds limits without an upgrade.
- Edge Middleware: Not supported for certain frameworks like Next.js (/app) or Next.js (/pages).

Conclusion

Epycbyte Functions offer a scalable and efficient way to build serverless applications. Choose the right plan based on your needs, and monitor usage through the dashboard for better optimization. For more details, visit the Epycbyte Documentation.
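The GB-Hours arithmetic described above can be sketched in a few lines (illustrative only; the function name is not part of any Epycbyte API):

```javascript
// Illustrative: compute billed GB-Hours for Node.js-runtime functions.
// memoryGb × durationSeconds gives GB-seconds per invocation;
// 3,600 GB-seconds make one GB-Hour.
function gbHours(memoryGb, durationSeconds, invocations) {
  return (memoryGb * durationSeconds * invocations) / 3600;
}

// Example from the text: 3 GB × 1 s × 1,200 executions = 3,600 GB-s = 1 GB-Hr.
```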

Last updated on Aug 05, 2025

40. cron-jobs: cron jobs

Cron Jobs

Cron jobs are time-based scheduling tools used to automate repetitive tasks on Epycbyte. By using a specific syntax called a cron expression, you can define the frequency and timing of each task. This helps improve efficiency and ensures that important processes are performed consistently.

What Are Cron Jobs?

Cron jobs are automated tasks that run at predefined times. They are widely used for tasks such as:
- Backups and archiving
- Email and Slack notifications
- Updating subscription quantities

Epycbyte supports cron jobs for both Serverless and Edge Functions, allowing you to automate tasks with ease.

How Epycbyte Supports Cron Jobs

Epycbyte makes it simple to set up and manage cron jobs. You can add cron jobs through:
- epycbyte.json
- Build Output API

For example, an endpoint like https://*.epycbyte.app/api/cron can be used to trigger a cron job.

Common Use Cases

1. Automating Backups
2. Sending Email Notifications
3. Updating Subscription Quantities
4. Scheduling Social Media Updates

How to Add Cron Jobs

You can add cron jobs by defining them in your epycbyte.json file or using the Build Output API. Epycbyte supports the standard five-field cron expression format:

minute hour day_of_month month day_of_week

For example:
- 5 * * * * triggers at 5 minutes past every hour.
- * * 5 * * triggers every minute on the 5th day of the month.

Managing Cron Jobs

When managing cron jobs, consider:
- Duration: Define how often the job runs.
- Error Handling: Ensure tasks are retried if they fail.
- Deployments: Manage multiple environments with different schedules.
- Concurrency Control: Avoid overlapping tasks.

You can also run cron jobs locally for testing.

Usage and Pricing

For detailed information on usage limits, pricing, and deployment options, visit the Usage and Pricing page.

Cron Job Templates

1. Cron OG Cards: A template for updating social media cards.
2. Epycbyte Cron Job Example: A Next.js app that updates data at different intervals.

Get started in minutes by following the Quickstart guide.
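The epycbyte.json route described above might be sketched as follows. The shape of the crons field mirrors common platform conventions and is an assumption to verify against Epycbyte's schema; the /api/cron path matches the example endpoint earlier in this article.

```json
{
  "crons": [
    {
      "path": "/api/cron",
      "schedule": "0 5 * * *"
    }
  ]
}
```

Here the schedule 0 5 * * * (minute 0, hour 5) would run the job every day at 5:00 am.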

Last updated on Aug 05, 2025

40. cron-jobs: Usage & Pricing for Cron Jobs

Usage & Pricing for Cron Jobs

Cron jobs are available on all Epycbyte plans, offering flexible scheduling options to suit different needs. Below is a detailed overview of the usage and pricing details for cron jobs.

Table of Contents
- Cron Jobs Overview
- Number of cron jobs per account
- Hobby scheduling limits
- Pro and Enterprise plans
- Pricing Details

Cron Jobs Overview

Cron jobs can be triggered by Serverless or Edge Functions, meaning the same usage and pricing limits apply to both.

Number of cron jobs per account
- Hobby plan: 2 cron jobs
- Pro plan: 40 cron jobs
- Enterprise plan: 100 cron jobs

Each project has a hard limit of 20 cron jobs per project.

Hobby Scheduling Limits

On the Hobby plan, Epycbyte cannot guarantee timely cron job invocations. For example:
- A cron job configured as 0 1 * * * (every day at 1 am) will trigger anywhere between 1:00 am and 1:59 am.
- More specific execution times may require upgrading to the Pro plan.

Pro and Enterprise Plans
- Pro plan: Unlimited cron invocations
- Enterprise plan: Unlimited cron invocations

Pricing Details

Cron jobs are included in all plans. However, because cron jobs are invoked by functions, the usage and pricing limits for those functions apply to all cron job executions. See Functions Limits and Pricing for details.

For more details, refer to the Manage Cron Jobs section.

Last updated on Aug 05, 2025

41. git: Epycbyte and Git Deployment Overview

Epycbyte and Git Deployment Overview 1. Project Setup and Production Branch Selection: - When creating a new project from a Git repository, Epycbyte automatically selects the production branch based on the presence of a main or master branch. If neither is present, it uses the Git repository's default branch setting. 2. Team Management and Deployment Permissions: - Hobby Teams: Only team owners connected via their Git account can deploy commits. Non-owners cannot deploy changes unless they become members of a Pro team. - Pro Teams: Members can deploy after being added, with the commit author's identity checked against the team's Login Connections. 3. Pull Requests and Authorization: - After forking a public repository, pull requests may require authorization if they modify epycbyte.json or environment variables. This is a security measure to prevent sensitive information leaks. - If the commit author is already a team member, the authorization step is skipped. 4. Preview Branches and Phases: - Default Preview: Changes pushed to non-production branches (e.g., a feature branch) are served through preview domains composed from the project and branch names, such as project-name-branch.epycbyte.app. - Multiple Phases (e.g., Staging): Additional Git branches can be assigned specific domains and environment variables, allowing for testing before merging into production. 5. Merging and Branch Management: - After testing in preview phases, changes are merged into the production branch. Preview branches are typically kept active for future use unless deleted.

Last updated on Aug 05, 2025

42. monorepos: Monorepos FAQ

Monorepos FAQ URL: https://epycbyte.com/docs/monorepos/monorepo-faq 1. How can I speed up builds? - The number of concurrent builds you have depends on your plan. Hobby plans allow 1 build, while Pro and Enterprise plans let you customize the number in billing settings. Learn more about concurrent builds. 2. How can I make my projects available on different paths under the same domain? - Each directory becomes a separate Epycbyte project. To host multiple projects under one domain, create a new project, assign the domain, and use an epycbyte.json file for path rewrites. 3. How are projects built after I push? - Pushing to connected Git repositories triggers parallel builds for each project. 4. Can I share source files between projects? - Yes, enable "Include source files outside the Root Directory" in project settings. For Yarn workspaces, see deploying with Yarn. 5. How can I use Epycbyte CLI without Project Linking? - Use environment variables instead of Project Linking. For example: EPYCBYTE_ORG_ID=team_123 EPYCBYTE_PROJECT_ID=prj_456 epycbyte Learn more about custom workflows with Epycbyte CLI. 6. Can I use Turborepo on the Hobby plan? - Yes, Turborepo is supported on all plans. 7. Can I use Nx with environment variables on Epycbyte? - Define environment variables in each deployment to avoid cache issues. Use Runtime Hash Inputs in nx.json to prevent stale cached values. For example: "runtimeCacheInputs": ["echo $MY_EPYCBYTE_ENV"] Table of Contents: 1. How can I speed up builds? 2. How can I make my projects available on different paths under the same domain? 3. How are projects built after I push? 4. Can I share source files between projects? 5. How can I use Epycbyte CLI without Project Linking? 6. Can I use Turborepo on the Hobby plan? 7. Can I use Nx with environment variables on Epycbyte?
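As a sketch of the path-rewrite approach from question 2, an epycbyte.json in the project holding the domain might look like this (the "rewrites", "source", and "destination" field names and the docs-project subdomain are illustrative assumptions):

```json
{
  "rewrites": [
    {
      "source": "/docs/:path*",
      "destination": "https://docs-project.epycbyte.app/docs/:path*"
    }
  ]
}
```

Each rewritten path is served by a separate project while visitors stay on the single assigned domain.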

Last updated on Aug 05, 2025

43. monorepos: Using Monorepos

Using Monorepos Monorepos are an excellent way to manage multiple projects within a single repository, making it easier to organize and work with your codebase. Epycbyte provides robust support for monorepos, allowing you to deploy them quickly and efficiently. Table of Contents 1. Getting Started 2. Deploying a Template Monorepo 3. Adding a Monorepo Through the Epycbyte Dashboard 4. Using the Epycbyte CLI 5. When Do Monorepo Builds Occur? 6. Ignoring the Build Step 7. Skipping Unaffected Projects 8. Requirements Getting Started Monorepos simplify project management by consolidating your codebase into a single repository. With Epycbyte, you can easily set up and manage monorepos to streamline your workflow. Deploying a Template Monorepo Epycbyte offers pre-configured templates for monorepos, making it simple to get started. Follow these steps: 1. Select Your Team: Ensure you're logged in and have the necessary permissions. 2. Create a New Project: Navigate to the project creation section and choose "Monorepo" as your template type. 3. Configure Settings: Set up your repository with the required configurations, such as root directories and build settings. 4. Deploy: Click "Deploy" to create your monorepo project. Epycbyte's templates are designed for quick deployment, allowing you to start managing multiple projects in minutes. Adding a Monorepo Through the Epycbyte Dashboard To add a monorepo manually: 1. Select Your Team: Ensure you're working within the correct team settings. 2. Create a New Project: Click on "Add Project" and select "Monorepo." 3. Import Your Repository: Enter your Git repository details to clone it into Epycbyte. 4. Set Root Directories: Define which directories will be included in your monorepo. 5. Configure Settings: Adjust build triggers, concurrency settings, and other configurations as needed. 6. Deploy: Save your changes and deploy the project. Using the Epycbyte CLI Epycbyte provides a command-line interface (CLI) for managing monorepos. 
Here's how to use it: 1. Install the CLI: Install the Epycbyte CLI globally (for example, pnpm i -g epycbyte). 2. Log In: Use epycbyte login to log in to your account. 3. Link Projects: To link a project, run epycbyte link --repo <project-name>. 4. Clone Repositories: Use epycbyte clone to clone repositories into your monorepo structure. When Do Monorepo Builds Occur? By default, any push or pull request to your monorepo will trigger a build. Epycbyte automatically builds all connected projects based on the defined rules. Ignoring the Build Step If you want to skip a build for specific branches or files, use the "Ignore Build" feature in project settings: 1. Go to Project Settings: Navigate to the "Settings" tab and select "General." 2. Enable Ignore Build: Toggle the switch next to "Ignore Build" to disable automatic builds. 3. Save Changes: Apply your changes and confirm that builds will be skipped for selected branches. Skipping Unaffected Projects Epycbyte allows you to skip building projects that don't require updates: 1. Go to Project Settings: Navigate to the "Settings" tab and select "General." 2. Enable Skip Deployment: Toggle the switch next to "Skip Deployment" to enable this feature. 3. Save Changes: Apply your changes, and Epycbyte will automatically skip building unaffected projects. Requirements For optimal performance: - Package Manager: Use npm, yarn, or pnpm with a lockfile at the root of your repository. - Unique Package Names: Ensure each package in your monorepo has a unique name in its package.json file. - Explicit Dependencies: Clearly define dependencies between packages in their respective package.json files. Additional Notes - Error Handling: Monitor build logs for errors and resolve issues promptly. - Release Phases: Use release phases to manage rollouts and deployments. - Private Registry Support: Epycbyte supports private registries for secure package management. 
- Glossary: Refer to the glossary for detailed explanations of terms and features.
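The Requirements above can be sketched as a root package.json using npm workspaces (the workspace layout and names are illustrative assumptions):

```json
{
  "name": "my-monorepo",
  "private": true,
  "workspaces": [
    "apps/*",
    "packages/*"
  ]
}
```

Each package under apps/ and packages/ needs a unique "name" in its own package.json, and the lockfile (here, package-lock.json) sits at the repository root.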

Last updated on Aug 05, 2025

44. monorepos: Remote Caching

Remote Caching Remote Caching is a powerful feature designed to optimize your workflow by eliminating redundant tasks and speeding up your development process. This article provides an in-depth guide on how to use Epycbyte's Remote Caching service effectively. Table of Contents 1. Getting Started - Incremental Migration Frameworks - Managing Projects 2. Linking to the Remote Cache 3. Testing and Verification 4. Integration with Epycbyte Build 5. Usage in External CI/CD Systems 6. Pricing and Limits Getting Started Incremental Migration Frameworks Remote Caching works seamlessly with incremental migration frameworks, allowing you to update your projects efficiently without rebuilding the entire project each time. This feature is particularly useful when working with large-scale applications or complex dependencies. Managing Projects Epycbyte's Remote Caching service integrates effortlessly with modern development practices, supporting a wide range of project management tools and workflows. Whether you're using monorepos, workspaces, or individual files, the service adapts to your needs. Linking to the Remote Cache To utilize Epycbyte's Remote Caching service, follow these steps: 1. Authentication: Ensure that you have authenticated with your Epycbyte account and have access to the necessary team permissions. 2. Link Command: Use the turbo link command in your terminal to establish a connection with the Remote Cache. This command is available for both local and cloud-based environments. 3. Team Collaboration: Once linked, team members can share artifacts directly from their development environments, enhancing collaboration without compromising security. Testing and Verification Local Cache Inspection After linking, you can inspect the cache by navigating to node_modules/.cache/turbo. This directory provides insights into which files and dependencies have been cached, helping you understand the caching mechanism in action. 
Build Speed Analysis Make a small change to any file and rebuild the project using turbo run build. Compare the build time with and without Remote Caching enabled. The difference will be immediately noticeable, especially for large or complex projects. Integration with Epycbyte Build When running turbo commands during a Epycbyte Build, Remote Caching is automatically enabled. This ensures that your build artifacts are shared across all projects and team members, fostering consistency and efficiency in your development pipeline. Usage in External CI/CD Systems To integrate Remote Caching with external CI/CD systems: 1. Environment Variables: Set the TURBO_TOKEN and TURBO_TEAM environment variables in your CI/CD configuration. 2. Artifact Sharing: Turborepo will automatically store task artifacts in the Epycbyte Remote Cache, ensuring smooth artifact management across different environments. Pricing and Limits Hobby Plan - Free Usage: Up to 100GB/month for downloads and 10GB free storage per month. - Notifications: Email alerts when usage limits are approaching. Pro Plan - Cost: $0.50 per incremental GB beyond the free limit. - Limits: 1TB/month for downloads and 4TB/month for uploads. Enterprise Plan - Custom Solutions: Tailored pricing based on specific needs, including advanced features and higher limits. Epycbyte's Remote Caching service is designed to meet the needs of both individual developers and large organizations. By leveraging incremental migration frameworks and efficient caching strategies, you can significantly improve your development workflow while reducing costs and improving collaboration. This concludes our guide to Epycbyte's Remote Caching service. For more information or support, visit the official documentation or contact Epycbyte's customer support team.
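The external CI/CD setup above can be sketched in GitHub Actions syntax (the workflow layout and secret names are assumptions; only the TURBO_TOKEN and TURBO_TEAM variable names come from this article):

```yaml
name: build
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    env:
      # Access token and team slug for the Epycbyte Remote Cache
      TURBO_TOKEN: ${{ secrets.TURBO_TOKEN }}
      TURBO_TEAM: my-team
    steps:
      - uses: actions/checkout@v4
      - run: npm ci
      # Task artifacts are read from and written to the Remote Cache
      - run: npx turbo run build
```

With these variables set, Turborepo stores and retrieves build artifacts from the Remote Cache instead of rebuilding from scratch on every CI run.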

Last updated on Aug 05, 2025
