The simplest way to try out Ozone is to run it via Docker. Apache publishes Docker images of Ozone to make this easy.

DockerHub
docker run -d -p 9878:9878 -p 9876:9876 apache/ozone

This is an all-in-one Docker container which includes all the required Ozone services, but only one datanode.

You can check the web UI at http://localhost:9878
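If you prefer the command line, a quick way to confirm that the S3 gateway is answering (assuming the all-in-one container above is running and port 9878 is mapped) is:

```shell
# Print the HTTP status code returned by the S3 gateway
# (assumes the all-in-one container from above is running)
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:9878
```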

Or you can use the AWS client:

Testing with AWS client
aws s3api --endpoint http://localhost:9878 create-bucket --bucket bucket1
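To confirm that the bucket was created, you can list the buckets through the same endpoint (a sketch, assuming the gateway from the previous step is still running):

```shell
# List buckets via the Ozone S3 gateway; bucket1 should appear in the output
aws s3api --endpoint http://localhost:9878 list-buckets
```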

As we have only one datanode, we should use the REDUCED_REDUNDANCY storage class to upload a file (you don't need it on a real cluster with at least three datanodes):

Uploading a file using S3
date > /tmp/testfile
aws s3 --endpoint http://localhost:9878 cp --storage-class REDUCED_REDUNDANCY /tmp/testfile  s3://bucket1/testfile
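To check that the upload really worked, you can copy the object back and compare it with the original (a sketch, assuming the upload above succeeded; the download path is arbitrary):

```shell
# Download the object and verify it matches the uploaded file
aws s3 --endpoint http://localhost:9878 cp s3://bucket1/testfile /tmp/testfile.download
diff /tmp/testfile /tmp/testfile.download && echo "upload verified"
```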

Running a pseudo-cluster with multiple containers

To run a real pseudo-cluster, you need a docker-compose definition. Create a new directory and save the docker-compose config and definitions there:

Running Multiple Dockers
docker run apache/ozone cat docker-compose.yaml > docker-compose.yaml
docker run apache/ozone cat docker-config > docker-config

Now you can start the multi-node cluster:

Running the Multi-node Docker Cluster
docker-compose up --scale datanode=3

At this point you have a running Ozone cluster with three datanodes. You can run standard Docker commands such as ps to verify that the cluster is running.
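For example (a sketch; the exact container names depend on the directory docker-compose was started in):

```shell
# List the containers of the compose project; you should see
# the Ozone services plus the three scaled datanode containers
docker-compose ps
```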

You can open a shell in one of the running containers and try out various commands.

Opening a shell in a container
docker-compose exec datanode bash

You can run the built-in load generator, called freon, to make sure that your cluster is fully functional.

Freon
ozone freon rk

The rk profile instructs freon to generate random keys. You can quit freon with CTRL-C at any time.


Once you have seen how the cluster runs, you can explore further by running more commands. When you are done, shut down the Docker cluster by running:

Shutdown cluster
docker-compose down


Thank you for trying out Ozone.