
Nussknacker Cloud

To go through this Quickstart, you need to already have access to Nussknacker Cloud. If you don't, please go to nussknacker.io to get an account.

This quickstart describes how to create your first scenario in Nussknacker Cloud and process the first few events.

info

Every Nu Cloud account has an example HTTP instance of Nussknacker with an example scenario deployed. Feel free to explore and modify it, or proceed with this tutorial to learn how to create such a scenario from scratch.

Input and output streams

Streams of events are both the input to and the output from Nussknacker. There are two ways you can connect Nu Cloud to your data:

  • over HTTP, by sending events to Nu Cloud's HTTP endpoints and subscribing to event sources to receive output events on your endpoints (internally this HTTP API is backed by Kafka and Schema Registry);
  • by letting Nu Cloud connect directly to your Kafka and Schema Registry to read events from and write to your topics.

Each Nu Cloud environment is set up to work in one of these ways. Depending on your environment's setup, the steps to process your first events differ slightly.

HTTP

Start by logging into your environment.

Defining topics

  • Go to the Topics tab (<your environment>/topics)
  • First create an input topic. Click Add topic, name it 'transactions' and use the following Avro schema:
transactions Avro schema
{
  "name": "transactions",
  "namespace": "io.nussknacker.cloud",
  "type": "record",
  "fields": [
    { "name": "amount", "type": "int" },
    {
      "name": "client",
      "type": {
        "name": "clientRecord",
        "type": "record",
        "fields": [
          { "name": "id", "type": "string" },
          { "name": "category", "type": "string" }
        ]
      }
    }
  ]
}
  • Then create two output topics: 'alerts' and 'auditRequests'. Use the following Avro schemas, respectively:
alerts Avro schema
{
  "name": "alerts",
  "namespace": "io.nussknacker.cloud",
  "type": "record",
  "fields": [
    { "name": "message", "type": "string" }
  ]
}
auditRequests Avro schema
{
  "name": "auditRequests",
  "namespace": "io.nussknacker.cloud",
  "type": "record",
  "fields": [
    { "name": "clientId", "type": "string" },
    { "name": "reason", "type": "string" }
  ]
}
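To make the schemas above concrete, here is what individual events on these three topics might look like, written as Python dicts (a sketch; the field values are made up for illustration):

```python
import json

# Hypothetical sample events matching the three Avro schemas above
# (values are invented for illustration).
transaction = {"amount": 21, "client": {"id": "1234", "category": "STANDARD"}}
alert = {"message": "Large transaction for client 1234"}
audit_request = {"clientId": "4321", "reason": "Large VIP transaction"}

# Serialized as JSON, e.g. the form used when publishing over HTTP:
print(json.dumps(transaction))
```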

Defining and deploying a new scenario

  • Go to the Scenarios tab (<your environment>/scenarios/)
  • Click New scenario, name it 'DetectLargeTransactions'.
  • Click Import on the right panel, upload this scenario and click Save.
  • Take a look at the imported scenario:
    • the first node is a source which reads from the 'http.transactions' topic ('http' indicates that you write to and receive events from the topic over HTTP, instead of connecting directly to the underlying Kafka);
    • then small transactions are filtered out and, based on the client category, a decision is made about what to do with each large transaction;
    • for standard clients an alarm is raised (by sending an event to the 'http.alerts' topic) and for other clients the transaction is sent for further inspection (an event to the 'http.auditRequests' topic).
  • Click Deploy on the right panel.
  • Wait until the scenario status shows that it is running.

Adding events subscription

To receive output events from Nu Cloud, you have to subscribe to the output topics your scenarios use.

You need an endpoint (or endpoints) that will receive all events sent to these topics. For the purpose of this quickstart you can create a mock endpoint using one of the many tools available online (webhook, mockbin, etc.).

  • Go to the Topics tab (<your environment>/topics)
  • Open the 'alerts' topic and click Add subscription. Choose a name that will be easy to identify later and set the endpoint that will receive the events. After you've created a subscription, it will take a moment to become active.
  • Do the same with the 'auditRequests' topic - add a subscription to receive its events.
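If you prefer to run your own receiver instead of an online mock, a minimal one can be sketched with only Python's standard library (an assumption-laden sketch: in practice the endpoint must be publicly reachable from Nu Cloud, e.g. via a tunnelling tool, and the port 8080 below is arbitrary):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def describe_event(raw: bytes) -> str:
    """Render a received alert/auditRequest event as a one-line summary."""
    event = json.loads(raw)
    fields = ", ".join(f"{k}={v!r}" for k, v in event.items())
    return f"received event: {fields}"

class SubscriptionHandler(BaseHTTPRequestHandler):
    """Accepts POSTed subscription events and logs them."""
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        print(describe_event(self.rfile.read(length)))
        self.send_response(200)
        self.end_headers()

# To run the receiver (blocks forever):
# HTTPServer(("", 8080), SubscriptionHandler).serve_forever()
```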

Processing events

  • Go to the Topics tab (<your environment>/topics)
  • Open the 'transactions' topic and copy the URL next to the topic's name.
  • Send the following request to this URL (use the publisher password set when the environment was created):
    curl -H "Content-Type: application/json" -u publisher:<PUBLISHER PASSWORD> <topic URL> -d '{ "amount": 21, "client": { "id": "1234", "category": "STANDARD" } }'
  • You should get a request on the endpoint defined in the 'alerts' topic subscription.
  • You can also try sending one more request:
    curl -H "Content-Type: application/json" -u publisher:<PUBLISHER PASSWORD> <topic URL> -d '{ "amount": 210, "client": { "id": "4321", "category": "VIP" } }'
  • You should get a request on the endpoint defined in the 'auditRequests' topic subscription.
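The same publish call can be sketched in Python with only the standard library (a sketch; `TOPIC_URL` and the password are the placeholders from the steps above, which you must substitute):

```python
import base64
import json
import urllib.request

# Placeholders from the steps above - substitute your real values.
TOPIC_URL = "<topic URL>"
PASSWORD = "<PUBLISHER PASSWORD>"

def build_event(amount: int, client_id: str, category: str) -> bytes:
    """Serialize a transaction event matching the 'transactions' schema."""
    return json.dumps(
        {"amount": amount, "client": {"id": client_id, "category": category}}
    ).encode()

def publish(event: bytes) -> int:
    """POST an event to the topic endpoint with HTTP basic auth."""
    credentials = base64.b64encode(f"publisher:{PASSWORD}".encode()).decode()
    request = urllib.request.Request(
        TOPIC_URL,
        data=event,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Basic {credentials}",
        },
    )
    with urllib.request.urlopen(request) as response:
        return response.status

# Example: publish(build_event(21, "1234", "STANDARD"))
```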

Kafka

Start by logging into your environment.

At this point your Nu Cloud environment should already be connected to your Kafka and Schema Registry (configuration is described here).

Defining topics

  • Create three topics in your Kafka and register their schemas in your Schema Registry (check whether your Schema Registry requires you to add the '-value' suffix manually), using the following JSON schemas. Nussknacker can also handle Avro schemas; we use JSON schemas here to make messages easier to send. If you want to use Avro, reuse the schemas from the HTTP setup above:
transactions JSON schema
{
  "$schema": "http://json-schema.org/draft-07/schema",
  "type": "object",
  "properties": {
    "amount": { "type": "integer" },
    "client": {
      "type": "object",
      "properties": {
        "id": { "type": "string" },
        "category": { "type": "string" }
      },
      "required": ["id", "category"]
    }
  },
  "required": ["client", "amount"],
  "additionalProperties": false
}
alerts JSON schema
{
  "$schema": "http://json-schema.org/draft-07/schema",
  "type": "object",
  "properties": {
    "message": { "type": "string" }
  },
  "required": ["message"],
  "additionalProperties": false
}
auditRequests JSON schema
{
  "$schema": "http://json-schema.org/draft-07/schema",
  "type": "object",
  "properties": {
    "clientId": { "type": "string" },
    "reason": { "type": "string" }
  },
  "required": ["clientId", "reason"],
  "additionalProperties": false
}
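Before wiring up Kafka, it can help to check a candidate event against the 'transactions' schema above. The following is a minimal hand-rolled check covering only this schema's constraints (a sketch; for real validation use a full JSON Schema library):

```python
# Minimal check of an event against the 'transactions' JSON schema above:
# integer 'amount', object 'client' with string 'id' and 'category',
# and no additional top-level properties.
def check_transaction(event: dict) -> list:
    """Return a list of problems; an empty list means the event looks valid."""
    problems = []
    if not isinstance(event.get("amount"), int):
        problems.append("'amount' must be an integer")
    client = event.get("client")
    if not isinstance(client, dict):
        problems.append("'client' must be an object")
    else:
        for field in ("id", "category"):
            if not isinstance(client.get(field), str):
                problems.append(f"'client.{field}' must be a string")
    extra = set(event) - {"amount", "client"}
    if extra:
        problems.append(f"unexpected properties: {sorted(extra)}")
    return problems

print(check_transaction({"amount": 21, "client": {"id": "1234", "category": "STANDARD"}}))  # → []
```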

Defining a new scenario

  • Go to the Scenarios tab (<your environment>/scenarios/)
  • Click New scenario, name it 'DetectLargeTransactions'.
  • Click Import on the right panel, upload this scenario and click Save.
  • Take a look at the imported scenario:
    • the first node is a source which reads from the 'transactions' topic;
    • then small transactions are filtered out and, based on the client category, a decision is made about what to do with each large transaction;
    • for standard clients an alarm is raised (by sending an event to the 'alerts' topic) and for other clients the transaction is sent for further inspection (an event to the 'auditRequests' topic).
  • Click Deploy on the right panel.
  • Wait until the scenario status shows that it is running.

Processing events

  • Send the following JSON message to the 'transactions' topic on your Kafka:
    { "amount": 21, "client": { "id": "1234", "category": "STANDARD" } }
  • You should get a message on the 'alerts' topic.
  • You can also try sending one more message:
    { "amount": 210, "client": { "id": "4321", "category": "VIP" } }
  • You should get a message on your 'auditRequests' topic.
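Producing these messages can be sketched in Python as well. Note the assumptions: the Kafka client library (kafka-python here, as one option) must be installed separately, and the broker address below is a placeholder; any Kafka client or the `kafka-console-producer` CLI works equally well:

```python
import json

def transaction_event(amount: int, client_id: str, category: str) -> str:
    """Build a JSON message matching the 'transactions' schema above."""
    return json.dumps(
        {"amount": amount, "client": {"id": client_id, "category": category}}
    )

def send(bootstrap_servers: str = "localhost:9092") -> None:
    """Produce a sample event to the 'transactions' topic.

    Assumption: kafka-python is installed and a broker is reachable at
    the given address; substitute your own bootstrap servers.
    """
    from kafka import KafkaProducer  # third-party library, not stdlib
    producer = KafkaProducer(
        bootstrap_servers=bootstrap_servers,
        value_serializer=str.encode,
    )
    producer.send("transactions", transaction_event(21, "1234", "STANDARD"))
    producer.flush()
```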

Next steps

If you want to know more about creating scenarios, see Scenario authoring. Please be aware that Nu Cloud uses the Lite engine in Streaming Processing Mode. To learn more, refer to Engines comparison.