Getting Started

Installation

Titan runs on any Windows or macOS system with Docker Desktop installed and configured to run Linux containers. If you can run docker run --rm busybox:latest echo hello, you’re good to go. If you are on a Linux system, consult the Documentation to determine whether your distribution is supported.
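You can run that sanity check before proceeding; the first run may also pull the busybox image:

$ docker run --rm busybox:latest echo hello
hello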

Once you have Docker configured and running, head to the download page. When you extract the archive, you will have a single titan binary; place it somewhere in your PATH (such as /usr/local/bin).
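A minimal install sketch (the archive name here is illustrative, and the final sanity check assumes the binary accepts --help; use the file you actually downloaded):

$ unzip titan-cli.zip
$ sudo mv titan /usr/local/bin/
$ titan --help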

Titan for Docker

To use Titan with containers running on your local workstation, run:

$ titan install

This will download and configure the titan server container that provides the storage capabilities Titan requires. It needs connectivity to Docker Hub; if you are working behind a firewall, consult the Documentation for more information. Part of the installation also involves installing ZFS on the Docker VM or Linux host. If you are using an unsupported Linux distribution, or a bleeding-edge Docker Desktop build, Titan may have to perform a custom build and can fail outright. Reach out to the community for assistance.

Titan for Kubernetes

Titan’s Kubernetes support is currently in beta. For more information on how it works, its limitations, and cluster requirements, see the Kubernetes Documentation. To install Titan for Kubernetes, run:

$ titan context install -t kubernetes

This still requires downloading and installing a titan container, but it does not require installing ZFS on the Docker VM or Linux host.

Cloning Your First Repo

We can launch a PostgreSQL database simply by cloning this demo repository:

$ titan clone -n hello-world s3web://demo.titan-data.io/hello-world/postgres

You should now see the repository running:

$ titan ls
CONTAINER             STATUS
hello-world           running

If the container fails to start, it may be because you already have PostgreSQL running on port 5432, in which case you will need to stop that database and try cloning again. We can also verify the contents using psql. If you don’t have the PostgreSQL client tools installed on your laptop, you can run alias psql='docker exec hello-world psql' to use the version shipped in the container.

$ psql postgres://postgres:postgres@localhost/postgres -t -c 'SELECT * FROM messages;'
 Hello, World!

If you don’t have PostgreSQL tools, you can also run the DynamoDB example:

$ titan clone -n hello-world s3web://demo.titan-data.io/hello-world/dynamodb
$ aws dynamodb scan --endpoint http://localhost:8000 --table-name messages | grep '"S"'
                "S": "Hello, World!"

Creating a New Repository

If you want to start your own database, you can do so with titan run, which runs any stateful Docker image with a VOLUME declaration, such as MongoDB:

$ titan run -n mongo-test mongo:latest

Committing and Checking out State

Using the mongo repository we created in the last step, let’s add some data. If you don’t have mongo installed on your laptop, you can run alias mongo="docker exec mongo-test mongo" to use the version shipped in the container.

$ mongo --quiet --eval 'db.employees.insert({firstName:"Ada",lastName:"Lovelace"})'

And commit that state:

$ titan commit -m "First Employee" mongo-test
Commit b040cfe3-aae5-42b2-a41c-6fe2e2baad1c
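If you need to find a commit id later, Titan provides a log command that lists a repository’s commits (the exact output format may vary by version):

$ titan log mongo-test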

Now, even if we add more data:

$ mongo --quiet --eval 'db.employees.insert({firstName:"Grace",lastName:"Hopper"})'
WriteResult({ "nInserted" : 1 })
$ mongo --quiet --eval 'db.employees.find()'
{ "_id" : ObjectId("5d88d264302cca22a91cfb9a"), "firstName" : "Ada", "lastName" : "Lovelace" }
{ "_id" : ObjectId("5d88d2f897a6f34e91e46f0d"), "firstName" : "Grace", "lastName" : "Hopper" }

We can check out our previous state:

$ titan checkout --commit b040cfe3-aae5-42b2-a41c-6fe2e2baad1c mongo-test
$ mongo --quiet --eval 'db.employees.find()'
{ "_id" : ObjectId("5d88d264302cca22a91cfb9a"), "firstName" : "Ada", "lastName" : "Lovelace" }

Additional Workflows

With just these simple tools, you can start to create and manage data state to match your workflow. Want to keep a blank database with your schema pre-applied? Want to keep some sample data available for you to do destructive testing? Have a failing test and want to keep the data state around to debug later? Titan lets developers work with the data they need, when they need it, all on their laptop environment.
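The destructive-testing workflow, for example, can be sketched with the same commands used above (the repository name is reused from the earlier example, and <commit-id> is a placeholder for the id Titan prints):

$ titan commit -m "Before destructive test" mongo-test
Commit <commit-id>
  ... run your destructive tests ...
$ titan checkout --commit <commit-id> mongo-test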

For more information on pushing and pulling data from remote repositories, more complex workflows to migrate data from existing containers and filesystems, and a complete breakdown of command-line features, see the Documentation.

Titan is still a young community, and there are plenty of rough edges and future ideas. Head over to the future section to learn more about what’s in store.