Getting started with Apache Camel using SpringBoot

kiarash shamaii · Jun 15

Apache Camel is a very useful library that helps you process events or messages from many different sources. You can move these messages across many different transports, such as in-JVM, HTTP, FTP, JMS, or even directories and files, while keeping your processing code free of transport logic. This lets you concentrate on the content of the messages instead.

Camel's Spring Boot support provides auto-configuration of Camel and starters for many Camel components. The opinionated auto-configuration of the Camel context auto-detects Camel routes available in the Spring context and registers the key Camel utilities (such as the producer template, consumer template and the type converter) as beans.

To get started, add the Camel Spring Boot BOM to your Maven pom.xml file (the parent pom.xml in a modular or microservices project).

<dependencyManagement>
    <dependencies>
        <!-- Camel BOM -->
        <dependency>
            <groupId>org.apache.camel.springboot</groupId>
            <artifactId>camel-spring-boot-bom</artifactId>
            <version>3.18.3.redhat-</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
        <!-- ... other BOMs or dependencies ... -->
    </dependencies>
</dependencyManagement>

Camel Spring Boot BOM vs Camel Spring Boot Dependencies BOM

The curated camel-spring-boot-dependencies BOM, which is generated, contains the adjusted JARs that both Spring Boot and Apache Camel use to avoid any conflicts. This BOM is used to test camel-spring-boot itself.

Spring Boot users may choose to use pure Camel dependencies by using the camel-spring-boot-bom that only has the Camel starter JARs as managed dependencies. However, this may lead to a classpath conflict if a third-party JAR from Spring Boot is not compatible with a particular Camel component.

After that, to demonstrate Camel's functionality, add a messaging solution such as Kafka or RabbitMQ (Kafka is used here).

<!-- Camel Starter -->
<dependency>
    <groupId>org.apache.camel.springboot</groupId>
    <artifactId>camel-spring-boot-starter</artifactId>
</dependency>
<dependency>
    <groupId>org.apache.camel.springboot</groupId>
    <artifactId>camel-spring-boot</artifactId>
</dependency>
<dependency>
    <groupId>org.apache.camel.springboot</groupId>
    <artifactId>camel-kafka-starter</artifactId>
</dependency>

spring.h2.console.enabled=true
spring.h2.console.path=/h2

spring.kafka.bootstrap-servers=localhost:
spring.kafka.consumer.group-id=products-group
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer


Adding Camel routes

Camel routes are detected in the Spring application context. For example, a route class annotated with org.springframework.stereotype.Component is loaded, added to the Camel context, and started.

import org.apache.camel.builder.RouteBuilder;
import org.springframework.stereotype.Component;

@Component
public class MyRoute extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        from("...")
            .to("...");
    }
}
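To make this concrete, here is a purely illustrative route where the placeholders are replaced by a timer endpoint feeding a log step (the endpoint values are made up for the example):

import org.apache.camel.builder.RouteBuilder;
import org.springframework.stereotype.Component;

@Component
public class HelloRoute extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        // Fire every 5 seconds and log a greeting
        from("timer:hello?period=5000")
            .setBody(constant("Hello from Camel"))
            .log("${body}");
    }
}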

Spring Boot Auto-configuration

The most important piece of functionality provided by the Camel auto-configuration is the CamelContext instance. Camel auto-configuration creates a SpringCamelContext for you and takes care of the proper initialization and shutdown of that context. The created Camel context is also registered in the Spring application context (under the camelContext bean name), so you can access it like any other Spring bean.

import org.apache.camel.CamelContext;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class MyAppConfig {

    @Autowired
    CamelContext camelContext;

    @Bean
    MyService myService() {
        return new DefaultMyService(camelContext);
    }
}
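As a quick sanity check, the injected CamelContext can also be used to list the routes that were discovered; this is just a sketch (the RoutePrinter class is hypothetical):

import org.apache.camel.CamelContext;
import org.apache.camel.Route;
import org.springframework.boot.CommandLineRunner;
import org.springframework.stereotype.Component;

@Component
public class RoutePrinter implements CommandLineRunner {

    private final CamelContext camelContext;

    public RoutePrinter(CamelContext camelContext) {
        this.camelContext = camelContext;
    }

    @Override
    public void run(String... args) {
        // Print every route Camel registered from the Spring application context
        for (Route route : camelContext.getRoutes()) {
            System.out.println("Camel route: " + route.getId() + " -> " + route.getEndpoint());
        }
    }
}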

Besides, you can configure the list of brokers in your configuration file (e.g. application.properties) through the following property:

camel.component.kafka.brokers=localhost:
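Once the brokers are configured at the component level, the Kafka endpoint URIs in your routes no longer need the brokers option. A minimal sketch (the topic name is illustrative):

import org.apache.camel.builder.RouteBuilder;
import org.springframework.stereotype.Component;

@Component
public class ProductsTopicRoute extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        // Brokers are taken from camel.component.kafka.brokers in application.properties
        from("kafka:myTopic")
            .log("Received from Kafka: ${body}");
    }
}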

First, check that the broker works properly with a sample class:

import org.apache.camel.builder.RouteBuilder;
import org.springframework.stereotype.Component;

@Component
public class SampleLoggerOfCamel extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        // Kafka Producer
        from("timer://foo?period=")
            .to("kafka:myTopic?brokers=localhost:");

        // Kafka Consumer
        from("kafka:myTopic?brokers=localhost:")
            .log("Message received from Kafka : ${body}")
            .log(" on the topic ${headers[kafka.TOPIC]}")
            .log(" on the partition ${headers[kafka.PARTITION]}")
            .log(" with the offset ${headers[kafka.OFFSET]}")
            .log(" with the key ${headers[kafka.KEY]}");
    }
}

Put this class in a module and run your application; you should see log output like this in the console:

-06-15 13:30:03.898 INFO --- [nsumer[myTopic]] route2 : on the topic myTopic
-06-15 13:30:03.898 INFO --- [nsumer[myTopic]] route2 : on the partition 0
-06-15 13:30:03.898 INFO --- [nsumer[myTopic]] route2 : with the offset 64
-06-15 13:30:03.899 INFO --- [nsumer[myTopic]] route2 : with the key
-06-15 13:30:04.898 INFO --- [nsumer[myTopic]] route2 : Message received from Kafka :
-06-15 13:30:04.898 INFO --- [nsumer[myTopic]] route2 : on the topic myTopic
-06-15 13:30:04.898 INFO --- [nsumer[myTopic]] route2 : on the partition 0
-06-15 13:30:04.898 INFO --- [nsumer[myTopic]] route2 : with the offset 65
-06-15 13:30:04.898 INFO --- [nsumer[myTopic]] route2

Now let's try a real project:

Camel auto-configuration provides pre-configured ConsumerTemplate and ProducerTemplate instances. You can simply inject them into your Spring-managed beans.
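For example, here is a minimal sketch of a bean that uses both templates; the endpoint URIs, the topic name, the default broker port 9092 and the timeout are assumptions for illustration:

import org.apache.camel.ConsumerTemplate;
import org.apache.camel.ProducerTemplate;
import org.springframework.stereotype.Service;

@Service
public class MessagingBridge {

    private final ProducerTemplate producerTemplate;
    private final ConsumerTemplate consumerTemplate;

    public MessagingBridge(ProducerTemplate producerTemplate, ConsumerTemplate consumerTemplate) {
        this.producerTemplate = producerTemplate;
        this.consumerTemplate = consumerTemplate;
    }

    public void send(String message) {
        // Fire-and-forget send to a Kafka topic (assumes the default broker port 9092)
        producerTemplate.sendBody("kafka:myTopic?brokers=localhost:9092", message);
    }

    public String receive() {
        // Poll a single message, waiting at most 5 seconds
        return consumerTemplate.receiveBody("kafka:myTopic?brokers=localhost:9092", 5000, String.class);
    }
}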

In this project there are two modules that must keep their data in sync, following the CQRS design pattern. Synchronizing the data requires a messaging solution (Kafka is used in this sample).

But there is a problem: if the messaging solution is temporarily unavailable, the two modules lose data consistency. This is where Apache Camel helps, since it can hold the event until it is sent successfully.
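Both modules exchange a ProductEvent payload. The article does not show that class, so here is a minimal sketch of what it might look like, based on how it is used in the services below (the field names are assumptions):

// Hypothetical event payload shared by both modules
public class ProductEvent {

    private String type;      // e.g. "ProductCreated"
    private Product product;  // the product entity being synchronized

    public ProductEvent() {
    }

    public ProductEvent(String type, Product product) {
        this.type = type;
        this.product = product;
    }

    public String getType() {
        return type;
    }

    public void setType(String type) {
        this.type = type;
    }

    public Product getProduct() {
        return product;
    }

    public void setProduct(Product product) {
        this.product = product;
    }
}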

//in module 1
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.camel.ProducerTemplate;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class ProductService {

    private final ProductRepository repository;
    private final KafkaTemplate<String, ProductEvent> kafkaTemplate;
    private final ProducerTemplate producerTemplate;

    public ProductService(ProductRepository repository,
                          KafkaTemplate<String, ProductEvent> kafkaTemplate,
                          ProducerTemplate producerTemplate) {
        this.repository = repository;
        this.kafkaTemplate = kafkaTemplate;
        this.producerTemplate = producerTemplate;
    }

    public Product add(Product p) throws JsonProcessingException {
        var product = repository.save(p);
        ProductEvent event = new ProductEvent("ProductCreated", product);

        // Convert the Java object into a JSON string with Jackson
        ObjectMapper objectMapper = new ObjectMapper();
        String jsonEvent = objectMapper.writeValueAsString(event);
        System.out.println(jsonEvent);

        // Send the event to Kafka through Camel's ProducerTemplate
        this.producerTemplate.asyncRequestBody("kafka:products?brokers=localhost:",
                jsonEvent);
        // just for testing Kafka directly
        // this.kafkaTemplate.send("products", event);
        return product;
    }
}


//module 2
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

import java.util.List;
import java.util.stream.Collectors;

@Service
public class ProductService {

    private final ProductRepository repository;

    public ProductService(ProductRepository repository) {
        this.repository = repository;
    }

    public List<Product> getAll() {
        return repository.findAll().stream()
                .collect(Collectors.toList());
    }

    @KafkaListener(topics = "products", groupId = "products1_group")
    public void processProductEvent(String event) {
        System.out.println("Getting event " + event);

        try {
            // Deserialize the JSON string back into a ProductEvent
            ProductEvent productEvent = new ObjectMapper().readValue(event, ProductEvent.class);
            System.out.println(productEvent);

            switch (productEvent.getType()) {
                case "ProductCreated":
                    this.repository.save(productEvent.getProduct());
                    break;
                default:
                    break;
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

Scenario 1:

Run the Kafka server and ZooKeeper, then start both modules and add a product. Afterwards you can see the result in the H2 database of both modules.

Scenario 2:

Just start the two modules (without Kafka) and add a product. When you check module 2, nothing happens because Kafka is down and the event is not processed. Now start Kafka:

Apache Camel sends the event properly and the problem is solved.
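How long Camel keeps trying depends on the error handling you configure. One possible approach, shown only as a hedged sketch and not the article's exact setup, is to send events to an intermediate direct: endpoint and let a Camel route with a redelivery policy forward them to Kafka (all values are illustrative):

import org.apache.camel.builder.RouteBuilder;
import org.springframework.stereotype.Component;

@Component
public class ProductEventForwardRoute extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        // Keep redelivering until the broker is reachable again (values illustrative)
        errorHandler(defaultErrorHandler()
                .maximumRedeliveries(-1)
                .redeliveryDelay(5000));

        // The service would send to "direct:productEvents" instead of the Kafka endpoint
        from("direct:productEvents")
            .to("kafka:products?brokers=localhost:9092");
    }
}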

Resources:

https://www.masterspringboot.com/camel/camel-kafka-with-spring-boot-example/
