Kong API Management



Kong is an open-source platform offering tools for API gateways, API management, governance, and Kubernetes. An API gateway proxies traffic to different services, which is beneficial for several reasons. By acting as a single entry point, the API gateway is responsible for routing traffic. That means if a company wants to break apart a monolith, the work is much easier, because the gateway can point to a new backend with a simple configuration change. Kong is a next-generation API platform designed specifically with microservices in mind.


Note: this tutorial will focus on using PowerShell with httpie.

Kong Features You Should Know About

  • Plugins for authentication
  • Canary releases
  • Monitoring for latency, anomalies, and high error rates
  • Scales from a single laptop to a global cluster
  • Role-Based Access Control (RBAC)
  • Modular; write custom plugins in Lua
  • Infrastructure agnostic
  • Works with Kubernetes
  • Security by obscuring API origin endpoints
  • Stable at roughly 80k requests per second with no data loss

Basic Representation of the Kong API Gateway

This is a basic, high-level representation of how an API gateway works, and it is what you should expect when running Kong locally.

Launching Kong Locally in Docker

Most of this comes from Kong's own documentation, but I'm going to put some spins on it to demonstrate other ways of using Kong. We'll start with basic Docker commands to stand up a PostgreSQL server and Kong. (As an alternative, Kong can also use Cassandra as its datastore.)


Start up the Database
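
A minimal sketch of this step, based on Kong's published Docker instructions; the network name, credentials, and image tag are illustrative and can be changed to suit your setup:

```shell
# Create a shared Docker network so Kong can reach the database by name
docker network create kong-net

# Start PostgreSQL with the database and credentials Kong's defaults expect
docker run -d --name kong-database \
  --network=kong-net \
  -p 5432:5432 \
  -e "POSTGRES_USER=kong" \
  -e "POSTGRES_DB=kong" \
  -e "POSTGRES_PASSWORD=kong" \
  postgres:13
```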

Migrate Database
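
Before Kong starts for the first time, its database schema has to be bootstrapped. A sketch, assuming the network and container names from the previous step:

```shell
# Run migrations in a throwaway container against the database started above
docker run --rm \
  --network=kong-net \
  -e "KONG_DATABASE=postgres" \
  -e "KONG_PG_HOST=kong-database" \
  -e "KONG_PG_PASSWORD=kong" \
  kong:latest kong migrations bootstrap
```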

Starting Kong Service
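
With the schema in place, the gateway itself can be started. This sketch follows Kong's Docker instructions and reuses the names from the steps above:

```shell
# Start Kong, wired to the PostgreSQL container over the shared network
docker run -d --name kong \
  --network=kong-net \
  -e "KONG_DATABASE=postgres" \
  -e "KONG_PG_HOST=kong-database" \
  -e "KONG_PG_PASSWORD=kong" \
  -e "KONG_PROXY_ACCESS_LOG=/dev/stdout" \
  -e "KONG_ADMIN_ACCESS_LOG=/dev/stdout" \
  -e "KONG_PROXY_ERROR_LOG=/dev/stderr" \
  -e "KONG_ADMIN_ERROR_LOG=/dev/stderr" \
  -e "KONG_ADMIN_LISTEN=0.0.0.0:8001" \
  -p 8000:8000 -p 8443:8443 \
  -p 8001:8001 -p 8444:8444 \
  kong:latest
```

Port 8000 is the proxy that clients will hit; port 8001 is the Admin API we'll use to configure services, routes, and plugins.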

Create API Routes
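
A sketch using httpie against the Admin API on port 8001. The service name, route path, and upstream URL here are illustrative; note that the quoting of the raw-JSON `paths` field may need adjusting depending on your shell (e.g. PowerShell vs. bash):

```shell
# Register a service that points at an upstream API
http POST :8001/services name=example-service url=http://mockbin.org/request

# Attach a route so the proxy forwards /example to that service
http POST :8001/services/example-service/routes name=example-route 'paths:=["/example"]'

# Exercise the route through the proxy on port 8000
http :8000/example
```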

Add Plugins

Kong has a number of authentication plugins that you can install and use, which makes integrating API security a breeze.

Authentication Plugins
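
As one example, Kong's key-auth plugin requires clients to present an API key. A sketch, assuming the `example-service` from earlier; the consumer name and key are illustrative:

```shell
# Enable key authentication on the service
http POST :8001/services/example-service/plugins name=key-auth

# Create a consumer and provision an API key for it
http POST :8001/consumers username=demo-user
http POST :8001/consumers/demo-user/key-auth key=my-secret-key

# Requests without the key are rejected; with the apikey header they pass through
http :8000/example apikey:my-secret-key
```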


Kong Swagger Documents

Canary Releases

With Kong, doing canary releases is rather easy. Routing only 10% of the traffic to the new API microservice gives you time to evaluate whether it is functioning as expected before shifting the rest.
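
One way to get a weighted split in open-source Kong is an upstream with two weighted targets; a sketch, where the upstream name and target hostnames are illustrative and `example-service` is the service created earlier:

```shell
# Create an upstream and weight its targets roughly 90/10
http POST :8001/upstreams name=example-upstream
http POST :8001/upstreams/example-upstream/targets target=old-api.internal:80 weight=90
http POST :8001/upstreams/example-upstream/targets target=new-api.internal:80 weight=10

# Point the existing service at the upstream by host name
http PATCH :8001/services/example-service host=example-upstream
```

Once the new version looks healthy, raising its target weight (and eventually removing the old target) completes the rollout without clients noticing.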