Migrate to another self-hosted instance

Last updated: Jan 11, 2023

On this page

  • Migrating event data using the Replicator App
  • Migrating non-event data
  • Exporting Postgres Data
  • 1. Creating a data dump from your old instance
  • 2. Accessing Postgres on your new instance
  • 3. Deleting data in the new instance
  • 4. Copying your data file to the Postgres pod
  • 5. Loading the data into the new instance
  • 6. Upgrade
  • Moving a license key

Note: If you're looking for a more hands-off experience we recommend migrating to PostHog Cloud.

Migrating event data using the Replicator App

If you're attempting this migration, feel free to ask questions and provide feedback via the PostHog Community Slack workspace or a GitHub issue. Be aware that some of the steps in this document are potentially destructive! Proceed with caution.

Requirements

  • Existing PostHog instance that you can log in to (if self-hosting, it should be running at least version 1.30.0)
  • New PostHog instance

Approach

With this migration, we will move your events over using a PostHog app, which will create the necessary person, person distinct ID, and related records on the new instance.

We will only migrate the data used to run analytical queries, such as events, persons, and person distinct IDs. This means that dashboard configurations, feature flags, and other data not used in analytical queries will need to be recreated on the new instance.

Instructions

Instructions for using PostHog Replicator app:

  1. Log in to your old PostHog instance
  2. Click 'Apps' on the left-hand tool bar
  3. Search for 'Replicator'
  4. Select the app, press 'Install'
  5. For the configuration details, use the following:
    • Host: the hostname of your new PostHog instance, e.g. app.posthog.com if you are using US Cloud, eu.posthog.com if you are using EU Cloud, or the domain you'll be sending events to if you are self-hosting
    • Project API Key: the API key for the new project that you want to send events to - found in the project settings page.
  6. See events come into the new PostHog instance, identically to the originals
  7. Once the Replicator is activated, go back to the configuration menu and click the gear next to "Export historical events" under "Jobs". On the calendar, select a start and end date, then click "Run job now". This starts the import of historical events (and might take a while).
  8. To track the progress of the export, click the graph icon ("App metrics") next to the Replicator app and open the "Historical Exports" tab. You can also follow the logs.
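Before kicking off a large historical export, you can confirm the new instance is accepting events by sending a single test event to its public capture endpoint. This is a minimal sketch: the host and <project_api_key> are placeholders you must replace with your own values.

```shell
# Send one test event to the new instance's capture endpoint.
# Replace the host and api_key with your own values; the event should
# show up in the new project's event list within a few seconds.
curl https://your-new-instance.example.com/capture/ \
  -H 'Content-Type: application/json' \
  -d '{
        "api_key": "<project_api_key>",
        "event": "migration_test_event",
        "distinct_id": "migration-test-user"
      }'
```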

Migrating non-event data

To migrate the non-event data (such as user information, feature flags, dashboard configurations, etc.), we'll need to create a data dump and load it into the new instance. This non-event data is stored in Postgres.

This is only officially supported when your new instance is an enterprise self-hosted deployment using Kubernetes. If you're using a different deployment method, you can still try this approach, but we can't guarantee it will work.

  • You should have a clean enterprise self-hosted PostHog instance up and running. Your new PostHog instance should have no ingested events. We recommend using a fresh and unused installation.
  • Your old and new instances should both be running the exact same version of PostHog (minimum 1.30.0).

Exporting Postgres Data

Please note that this guide assumes your PostHog pods are running in the posthog kubernetes namespace, thus appending -n posthog to kubectl commands. Substitute posthog for the namespace applicable to you, or omit the -n flag if you're using the default namespace.
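If you're not sure which namespace your deployment lives in, you can look it up first. A quick sketch using standard kubectl commands:

```shell
# List every namespace, then find the pods belonging to your PostHog release.
kubectl get namespaces
kubectl get pods --all-namespaces | grep -i posthog
```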

1. Creating a data dump from your old instance

Access your old PostHog instance and run the following command:

Terminal
# tip: use the flag -U to specify a username if necessary
pg_dump -d posthog -f /tmp/export.sql --no-owner --data-only --disable-triggers \
-t posthog_action \
-t posthog_actionstep \
-t posthog_annotation \
-t posthog_cohort \
-t posthog_dashboard \
-t posthog_dashboarditem \
-t posthog_featureflag \
-t posthog_featureflagoverride \
-t posthog_messagingrecord \
-t posthog_organization \
-t posthog_organizationinvite \
-t posthog_organizationmembership \
-t posthog_personalapikey \
-t posthog_plugin \
-t posthog_pluginattachment \
-t posthog_pluginconfig \
-t posthog_pluginstorage \
-t posthog_team \
-t posthog_user \
-t posthog_user_groups \
-t posthog_user_user_permissions

Note: You may see a warning regarding circular dependencies between tables, which shouldn't be an issue as we are using --disable-triggers. If you do run into issues when restoring, try doing a full dump instead of --data-only.
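Before copying the file anywhere, it's worth a quick sanity check that the dump actually contains data. A sketch, assuming the dump was written to /tmp/export.sql: with --data-only, each exported table appears in the file as a COPY statement.

```shell
# Check the dump file exists and has a non-trivial size.
ls -lh /tmp/export.sql
# Count the COPY statements; expect roughly one per exported table.
grep -c '^COPY ' /tmp/export.sql
```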

If you do not access your Postgres database via a port-forwarding mechanism to your local machine, you will then need to copy the file from the server to your local system (or the remote server from where you access your new instance). You can do this with scp:

Note: If your old PostHog instance is hosted on Heroku, you should follow this guide to export all of your data, load into a separate database (we recommend a local Postgres instance), and then run the command above.

Terminal
scp <your_username>@<database_hostname>:/tmp/export.sql .

If your old instance was deployed using the Helm chart, you can use:

Terminal
kubectl cp -n posthog <postgres-pod-name>:/tmp/export.sql ./export.sql

2. Accessing Postgres on your new instance

To access Postgres on your new PostHog instance (Kubernetes cluster), you should do the following:

Tip: Find out your pod names with kubectl get pods -n posthog

  1. Find out your Postgres password from the web pod:

    Terminal
    kubectl exec -n posthog -it your-posthog-web-pod \
    -- sh -c 'echo password:$POSTHOG_DB_PASSWORD'
  2. Connect to your Postgres pod:

    Terminal
    # Replace posthog-posthog-postgresql-0 with your pod's name if different
    kubectl exec -n posthog -it posthog-posthog-postgresql-0 -- sh
  3. Connect to the posthog DB:

    Terminal
    psql -d posthog -U postgres

    Postgres will ask you for the password. Use the value you found from step 1.

3. Deleting data in the new instance

In order to correctly migrate your data over, we need to make sure the tables we're importing into are completely empty. This will ensure that the foreign key mappings are set up correctly.

To do so, connect to your new Postgres instance using the command from the previous step.

Refresher: From the shell of your Postgres pod, connect with psql -d posthog -U postgres (see step 2).

Once connected, you should have access to the psql shell. To erase all data in the tables we'll be importing into, run the following:

Warning: Only run this in your new instance.

SQL
-- This command is a destructive action that leads to data deletion!!!
-- Uncomment the next line (remove '--') if you're certain you want to run it
-- TRUNCATE TABLE posthog_action,
posthog_actionstep,
posthog_annotation,
posthog_cohort,
posthog_dashboard,
posthog_dashboarditem,
posthog_featureflag,
posthog_featureflagoverride,
posthog_messagingrecord,
posthog_organization,
posthog_organizationinvite,
posthog_organizationmembership,
posthog_personalapikey,
posthog_plugin,
posthog_pluginattachment,
posthog_pluginconfig,
posthog_pluginstorage,
posthog_team,
posthog_user,
posthog_user_groups,
posthog_user_user_permissions
CASCADE;

The command output may indicate that some other tables were also truncated. This is normal and expected.
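Before importing, you can double-check that the truncation worked by counting rows in a couple of the affected tables. A sketch, run from the shell of the Postgres pod (any of the truncated tables will do):

```shell
# Both counts should be 0 on a freshly truncated instance.
psql -d posthog -U postgres -c 'SELECT count(*) FROM posthog_user;'
psql -d posthog -U postgres -c 'SELECT count(*) FROM posthog_team;'
```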

4. Copying your data file to the Postgres pod

To load the data into the new Postgres instance, the export file first needs to be accessible from inside the Postgres pod.

To do so, run the following in the directory of your export (data dump) file:

Terminal
# Replace posthog-posthog-postgresql-0 with your pod's name if different
kubectl cp export.sql posthog-posthog-postgresql-0:/tmp/export.sql -n posthog

This will copy the export file to the /tmp/ directory in the Postgres pod.

5. Loading the data into the new instance

With the export file now in the right place, we can finally import the data!

First, access your Postgres pod:

Terminal
# Replace posthog-posthog-postgresql-0 with your pod's name if different
kubectl exec -n posthog -it posthog-posthog-postgresql-0 -- sh

Now, run the following to import the data:

Terminal
psql -d posthog -U postgres < /tmp/export.sql

I'm seeing some errors, what should I do?

If you see some errors on this final step, you can try to use pg_restore instead. You should follow all steps from the beginning, with two modifications:

  1. When running the pg_dump command, add the flag -Fc. Your command will end up looking something like this:
Terminal
pg_dump -Fc -d posthog -f /tmp/export.sql --no-owner --data-only # ...tables
  2. On this final step, instead of psql, use pg_restore, like so:
Terminal
pg_restore -U postgres -d posthog < /tmp/export.sql

If errors persist, reach out to us on Slack.

And that's it! If you see no errors, your data should now have been migrated over.
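As an extra check, you can compare row counts for a few key tables between the old and new instances. A sketch: run the same queries on both sides, and the numbers should match.

```shell
# Run on both the old and the new Postgres instance and compare the output.
psql -d posthog -U postgres -c 'SELECT count(*) FROM posthog_dashboard;'
psql -d posthog -U postgres -c 'SELECT count(*) FROM posthog_featureflag;'
```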

6. Upgrade

As a final step, we should update the cluster with the following command:

Terminal
helm upgrade -f values.yaml --timeout 30m posthog posthog/posthog --atomic -n posthog

Once the command completes, you can check that all pods are healthy (status = Running) with the following command:

Terminal
kubectl get pods -n posthog

The output should look something like this:

Terminal
NAME                                                READY   STATUS    RESTARTS   AGE
chi-posthog-posthog-0-0-0                           1/1     Running   0          11d
clickhouse-operator-6b5438eh5fb-bt5fk               2/2     Running   0          11d
posthog-beat-7782927b778-wxvhl                      1/1     Running   0          11d
posthog-cert-manager-69fahs7b57-c48dn               1/1     Running   0          11d
posthog-cert-manager-cainjector-6d95d93mn8-6tz6k    1/1     Running   0          11d
posthog-cert-manager-webhook-6469930mdfc-6l958      1/1     Running   0          11d
posthog-events-55283995cc-rpjdm                     1/1     Running   0          11d
posthog-ingress-nginx-controller-648bdn892f-w7qhp   1/1     Running   0          11d
posthog-pgbouncer-77fb4djs85c-2d24t                 1/1     Running   0          11d
posthog-plugins-54fjd8292649-66gsm                  1/1     Running   0          18m
posthog-posthog-kafka-0                             1/1     Running   0          11d
posthog-posthog-postgresql-0                        1/1     Running   0          11d
posthog-posthog-redis-master-0                      1/1     Running   0          11d
posthog-posthog-zookeeper-0                         1/1     Running   0          11d
posthog-posthog-zookeeper-1                         1/1     Running   0          11d
posthog-posthog-zookeeper-2                         1/1     Running   0          11d
posthog-web-78dns2f5d7-6zdkc                        1/1     Running   0          11d
posthog-worker-7857nd8268-j8c4f                     1/1     Running   0          11d

Moving a license key

If you have an existing license key, please email sales@posthog.com to move your license key over.


Authors

  • Eli Kinsey
  • Luke Harries
  • Paul Hultgren
