In this tutorial, we will show you how to install and set up Apache Kafka on a VPS running Ubuntu 18.04.
Kafka, or Apache Kafka, is a distributed messaging system based on the publish-subscribe (pub-sub) model. It allows us to publish and subscribe to streams of records that can be categorized. It is an incredibly fast, highly scalable, fault-tolerant system, and it's designed to process large amounts of data in real time. Apache Kafka can also be used as an alternative to a message broker, which allows us to process and transform streams of records. Kafka can serve as a messaging system, but at a far larger scale than traditional brokers. Overall, Apache Kafka is a very powerful tool when used correctly.
Prerequisites
- A Server running Ubuntu 18.04 with at least 4GB of memory. For the purposes of this tutorial, we’ll be using one of our Managed Ubuntu 18.04 VPSes.
- SSH access with root privileges, or access to the “root” user itself
Step 1: Log in via SSH and Update the System
Log in to your Ubuntu 18.04 VPS with SSH as the root user:
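The exact command depends on your credentials, but a typical root login looks like this:

```
ssh root@IP_Address -p Port_Number
```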
Replace “root” with a user that has sudo privileges if necessary. Additionally, replace “IP_Address” and “Port_Number” with your server’s respective IP address and SSH port.
Once that is done, you can check whether you have the proper Ubuntu version installed on your server with the following command:
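For example, using lsb_release:

```
lsb_release -a
```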
You should get this output:
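Assuming a standard Ubuntu 18.04 image, it will look roughly like this (the exact point release may differ):

```
No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 18.04 LTS
Release:        18.04
Codename:       bionic
```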
Then, run the following command to make sure that all installed packages on the server are updated to their latest available versions:
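With the default apt package manager, that would be:

```
apt update && apt upgrade -y
```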
Step 2: Add a System User
Let’s create a new user called ‘kafka’, after which we will add this new user as a sudoer.
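Using the standard Ubuntu tooling, this can be done as follows (you will be prompted to set a password and some optional details for the new user):

```
# create the user, then add it to the sudo group
adduser kafka
usermod -aG sudo kafka
```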
Step 3: Install Java
Kafka is written in Java, so a JVM is required to get it working. In this tutorial, we will use OpenJDK 11, which has been the standard version of Java shipped with Ubuntu since September 2018.
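One way to install it is through the openjdk-11-jdk package, then verify the installation:

```
apt install openjdk-11-jdk -y
java -version
```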
Step 4: Download Apache Kafka
Now let's download Kafka. You can go to the official Apache Kafka downloads page and grab the latest release if necessary; the link used in the example below was current at the time of writing.
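The 2.5.0 / Scala 2.12 build from the Apache archive is used here purely as an example, so swap in whichever release you downloaded. We switch to the kafka user first so the archive ends up in its home directory:

```
su - kafka
wget https://archive.apache.org/dist/kafka/2.5.0/kafka_2.12-2.5.0.tgz
```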
Now that the Apache Kafka binary has been downloaded, we need to extract it into our kafka user's home directory.
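Still as the kafka user, and assuming the archive name from the previous step:

```
tar -xzf kafka_2.12-2.5.0.tgz
cd kafka_2.12-2.5.0
```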
Step 5: Configure Apache Kafka
It is time to configure Apache Kafka. By default, Kafka does not allow us to delete topics, the categories or groups in which messages can be posted. To change this behavior, we need to edit the default configuration.
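The broker settings live in config/server.properties inside the extracted directory; assuming the layout from the previous step, open the file with nano:

```
nano ~/kafka_2.12-2.5.0/config/server.properties
```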
Append the following line to the end of the configuration file.
delete.topic.enable = true
Step 6: Create a System Unit File for Apache Kafka
ZooKeeper is required for running Kafka, so we'll need to start an instance of the ZooKeeper server before starting the Apache Kafka service. In this tutorial, we will use the convenience script packaged with Kafka to get a quick-and-dirty single-node ZooKeeper instance.
Create a new file at /etc/systemd/system/zookeeper.service and open it in your preferred text editor. We'll be using nano for this tutorial.
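For example (sudo is needed because the file lives under /etc):

```
sudo nano /etc/systemd/system/zookeeper.service
```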
Paste the following lines into it:
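A minimal unit definition along these lines will work; the paths assume Kafka was extracted to /home/kafka/kafka_2.12-2.5.0, so adjust them to match your setup:

```
[Unit]
Requires=network.target remote-fs.target
After=network.target remote-fs.target

[Service]
Type=simple
User=kafka
ExecStart=/home/kafka/kafka_2.12-2.5.0/bin/zookeeper-server-start.sh /home/kafka/kafka_2.12-2.5.0/config/zookeeper.properties
ExecStop=/home/kafka/kafka_2.12-2.5.0/bin/zookeeper-server-stop.sh
Restart=on-abnormal

[Install]
WantedBy=multi-user.target
```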
Now, let's create a system unit file for Kafka at the filepath /etc/systemd/system/kafka.service:
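Again using nano:

```
sudo nano /etc/systemd/system/kafka.service
```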
Paste the following lines into the file:
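As before, this is a minimal sketch that assumes the /home/kafka/kafka_2.12-2.5.0 install path; note that it declares a dependency on the ZooKeeper unit created above:

```
[Unit]
Requires=zookeeper.service
After=zookeeper.service

[Service]
Type=simple
User=kafka
ExecStart=/bin/sh -c '/home/kafka/kafka_2.12-2.5.0/bin/kafka-server-start.sh /home/kafka/kafka_2.12-2.5.0/config/server.properties > /home/kafka/kafka.log 2>&1'
ExecStop=/home/kafka/kafka_2.12-2.5.0/bin/kafka-server-stop.sh
Restart=on-abnormal

[Install]
WantedBy=multi-user.target
```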
The new system units have been added, so let’s enable Apache Kafka to automatically run on boot, and then run the service.
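Assuming the unit names above, reload systemd, enable the kafka service so it starts on boot, and then start it; because of the Requires= directive, starting Kafka will also bring up ZooKeeper:

```
sudo systemctl daemon-reload
sudo systemctl enable kafka
sudo systemctl start kafka
sudo systemctl status kafka
```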
Step 7: Create a Topic
In this step, we will create a topic named “FirstTopic”, with a single partition and only one replica:
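Kafka ships a helper script for this; assuming the install directory used earlier and ZooKeeper listening on its default port 2181, the topic can be created like so:

```
cd ~/kafka_2.12-2.5.0
bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic FirstTopic
```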
The replication-factor value describes how many copies of data will be created. We are running with a single instance, so the value would be 1.
The partitions value describes how many partitions the topic's data will be split between; with more than one broker, those partitions are spread across the cluster. We are running with a single broker, so the value would be 1.
Now you can see the created topic on Kafka by running the list topic command:
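Still from inside the Kafka directory:

```
bin/kafka-topics.sh --list --zookeeper localhost:2181
```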
Step 8: Send Messages using Apache Kafka
Apache Kafka comes with a command line client that will take input from a file or standard input and send it out as messages to the Kafka cluster. The “producer” is the process that has responsibility for putting data into our Kafka service. By default, Kafka sends each line as a separate message.
Let's run the producer and then type a few messages into the console to send to the server.
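Assuming the broker is listening on its default address of localhost:9092 and using the FirstTopic topic created earlier, run the bundled console producer from inside the Kafka directory; every line typed afterwards is sent as a message:

```
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic FirstTopic
```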
Keep the terminal open, and let's proceed to the next step.
Step 9: Use Apache Kafka as a Consumer
Apache Kafka also has a command line client for the consumer, which reads data from Kafka and displays the messages on standard output.
Run the following command in a new SSH session.
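Log in as the kafka user in the new session, then start the bundled console consumer; the --from-beginning flag makes it replay any messages sent before the consumer started (broker address and topic name are the same assumptions as above):

```
cd ~/kafka_2.12-2.5.0
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic FirstTopic --from-beginning
```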
That’s it! Apache Kafka has been successfully installed and set up. Now we can type some messages on the producer terminal as stated in the previous step. The messages will be immediately visible on our consumer terminal.
Of course, you don’t have to know how to install Apache Kafka on Ubuntu 18.04 if you have an Ubuntu 18.04 VPS hosted with us. If you do, you can simply ask our support team to install Apache Kafka on Ubuntu 18.04 for you. They are available 24/7 and will be able to help you with the installation of Apache Kafka, as well as any additional requirements that you may have.
PS. If you enjoyed reading this blog post on how to install Apache Kafka on Ubuntu 18.04, feel free to share it on social networks using the shortcuts below, or simply leave a comment down in the comments section. Thank you.