Beyond Sentiment: 5 Powerful NLP Features You Can Build with Java, Quarkus & a Local LLM
From summarization to translation: Unlock practical AI features in your existing Java stack using Langchain4j and Quarkus.
So you've got a local LLM running in Quarkus (thanks to Ollama + Langchain4j) and you've built a sentiment analyzer. Don't stop there. This article shows you five more hands-on NLP tasks you can implement with minimal effort: text summarization, question answering, text generation, keyword/entity extraction, and translation. Each one is practical, production-aware, and a great way to get more from your model investment.
Introduction: So You’ve Built a Sentiment Analyzer… Now What?
Let’s say you've completed your first Langchain4j-powered feature: a sentiment analyzer running in Quarkus against a local model like Llama 3 or Mistral. Great job. But that’s just the beginning.
Local LLMs aren’t toys; they’re capable engines that can solve meaningful business problems. With the same tools and stack you’ve already set up, you can build intelligent features like document summarization, Q&A, content generation, translation, and structured extraction.
This tutorial walks through five practical use cases with code examples in Java using Quarkus and Langchain4j.
If you're new to the stack, check out my sentiment analysis tutorial first. Otherwise, let’s dive in.
Text Summarization
Use case: Automatically condense long input into a digestible format.
Common examples:
Summarizing customer reviews or support tickets.
Generating executive summaries from reports.
Creating social media abstracts from blog posts.
How It Works
You craft a simple Langchain4j service that passes a summary request to your LLM.
@RegisterAiService
public interface SummarizationService {

    @SystemMessage("You are an AI assistant that summarizes text. Provide a concise summary of the given text.")
    @UserMessage("Summarize the following text: {textToSummarize}")
    String summarize(String textToSummarize);
}
Now inject and use it from your application service:
@ApplicationScoped
public class MyTextProcessingService {

    @Inject
    SummarizationService summarizationService;

    public String getSummary(String longText) {
        return summarizationService.summarize(longText);
    }
}
Example:
String article = "...long text...";
String summary = myTextProcessingService.getSummary(article);
System.out.println("Summary: " + summary);
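Local models have limited context windows, so very long articles may need to be split before calling summarize. A minimal chunking sketch (the class name and character budget are assumptions, not part of Langchain4j):

```java
import java.util.ArrayList;
import java.util.List;

public final class TextChunker {

    // Split text into chunks of at most maxChars, breaking on paragraph
    // boundaries so each chunk stays coherent. A single paragraph longer
    // than maxChars is kept whole rather than cut mid-sentence.
    public static List<String> splitIntoChunks(String text, int maxChars) {
        List<String> chunks = new ArrayList<>();
        StringBuilder current = new StringBuilder();
        for (String paragraph : text.split("\n\n")) {
            if (current.length() + paragraph.length() + 2 > maxChars && current.length() > 0) {
                chunks.add(current.toString().trim());
                current.setLength(0);
            }
            current.append(paragraph).append("\n\n");
        }
        if (current.length() > 0) {
            chunks.add(current.toString().trim());
        }
        return chunks;
    }
}
```

Each chunk can then be summarized individually, and the concatenated partial summaries passed through summarize once more — a simple map-reduce pattern for long documents.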
Question Answering (Q&A)
Use case: Ask a question about a document or context snippet and get a focused answer.
Business value:
Build an FAQ chatbot from PDFs or documentation.
Answer product questions from manuals or descriptions.
Extract specific legal or regulatory information.
Implementation
@RegisterAiService
public interface QAService {

    @SystemMessage("You are an AI assistant. Answer the question based ONLY on the provided context. If the answer is not in the context, say 'I don't have enough information to answer'.")
    @UserMessage("""
            Context:
            ---
            {context}
            ---
            Question: {question}
            """)
    String answerQuestion(String context, String question);
}
Usage (with an injected QAService):
String doc = "...product manual...";
String answer = qaService.answerQuestion(doc, "What battery does it use?");
This is a baby step toward Retrieval Augmented Generation (RAG). If you want RAG at scale, start exploring Langchain4j’s retriever and embedding tools.
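Until you get there, a crude retrieval pass in plain Java can already shrink the context you send: score each paragraph by word overlap with the question and pass only the best match as {context}. A purely illustrative sketch — not embedding-based retrieval, and the class name is an assumption:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

public final class NaiveRetriever {

    // Return the paragraph sharing the most lower-cased words with the
    // question; a crude stand-in for embedding-based retrieval.
    public static String bestParagraph(String document, String question) {
        Set<String> questionWords = new HashSet<>(
                Arrays.asList(question.toLowerCase().split("\\W+")));
        String best = "";
        int bestScore = -1;
        for (String paragraph : document.split("\n\n")) {
            int score = 0;
            for (String word : paragraph.toLowerCase().split("\\W+")) {
                if (questionWords.contains(word)) {
                    score++;
                }
            }
            if (score > bestScore) {
                bestScore = score;
                best = paragraph;
            }
        }
        return best;
    }
}
```

Word overlap breaks down quickly on paraphrased questions, which is exactly what embeddings fix — but it keeps prompts small while you evaluate a proper retriever.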
Text Generation
Use case: Automate text creation tasks like writing emails, marketing copy, or product descriptions.
Common examples:
Generate boilerplate responses for support.
Create variant descriptions for e-commerce listings.
Assist in brainstorming or drafting.
Code
@RegisterAiService
public interface TextGenerationService {

    @SystemMessage("You are a helpful AI assistant designed to generate text based on user requests.")
    @UserMessage("Generate a {textType} based on the following input: {input}")
    String generateText(String textType, String input);

    @UserMessage("Draft a polite follow-up email to a customer about their recent inquiry regarding order {orderId}. Mention that we are looking into it and will update them within 2 business days.")
    String draftFollowUpEmail(String orderId);
}
Example usage (with an injected TextGenerationService):
String features = "Ultra-lightweight, 12-hour battery, touch screen";
String copy = textGenerationService.generateText("product description", features);
String email = textGenerationService.draftFollowUpEmail("ORDER-4567");
This is ideal for internal tools that help non-technical teams work faster, without bouncing between ChatGPT tabs.
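One production note before wiring this to user input: values like {orderId} are interpolated directly into the prompt, so untrusted input can smuggle extra instructions into it (prompt injection). A minimal, illustrative guard — the whitelist pattern is an assumption about what your order IDs look like:

```java
public final class PromptInputGuard {

    // Reject anything that doesn't look like a plain order id before it is
    // interpolated into a prompt template; purely illustrative validation.
    public static String requireOrderId(String candidate) {
        if (candidate == null || !candidate.matches("[A-Z0-9-]{1,32}")) {
            throw new IllegalArgumentException("Invalid order id: " + candidate);
        }
        return candidate;
    }
}
```

Validating inputs at the boundary is cheap insurance; free-text fields need more careful handling than a regex can provide.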
Keyword and Entity Extraction
Use case: Pull important information from unstructured text.
Real-world uses:
Extract names, places, and orgs from articles.
Detect topics or categories for support tickets.
Power metadata tagging or search indexing.
Service
@RegisterAiService
public interface ExtractionService {

    @SystemMessage("You are an AI assistant that extracts information from text.")
    @UserMessage("Extract the main keywords from the following text: {text}")
    String extractKeywords(String text);

    @UserMessage("Extract all person names and organizations mentioned in the following text: {text}")
    String extractEntities(String text);
}
You can get fancy and return JSON, but basic string parsing often suffices for internal use.
Example (with an injected ExtractionService):
String article = "...some PR release...";
String keywords = extractionService.extractKeywords(article);
String entities = extractionService.extractEntities(article);
Tip: Good prompt design helps the LLM structure the output consistently for downstream parsing.
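If your prompt asks the model to reply with a comma-separated list, a small parser turns the raw reply into structured data. A sketch that assumes a reply shaped like "Java, Quarkus, LLM":

```java
import java.util.ArrayList;
import java.util.List;

public final class KeywordParser {

    // Normalize a comma-separated LLM reply into a clean, de-duplicated,
    // lower-cased keyword list; assumes the prompt asked for that format.
    public static List<String> parse(String llmReply) {
        List<String> keywords = new ArrayList<>();
        for (String part : llmReply.split(",")) {
            String keyword = part.trim().toLowerCase();
            if (!keyword.isEmpty() && !keywords.contains(keyword)) {
                keywords.add(keyword);
            }
        }
        return keywords;
    }
}
```

The trim/lower-case/de-duplicate steps absorb the small formatting wobbles LLM replies tend to have, which is usually enough for internal tooling.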
Simple Translation
Use case: Translate text between languages using the same local model.
Caveats:
Translation quality varies heavily across models.
Some smaller local LLMs simply aren’t trained on multilingual corpora.
Implementation
@RegisterAiService
public interface TranslationService {

    @SystemMessage("You are an AI assistant that translates text accurately.")
    @UserMessage("Translate the following text from {sourceLanguage} to {targetLanguage}: {textToTranslate}")
    String translate(@V("textToTranslate") String text,
                     @V("sourceLanguage") String sourceLang,
                     @V("targetLanguage") String targetLang);
}
In code (with an injected TranslationService):
String result = translationService.translate("Bonjour le monde!", "French", "English");
Before production:
Manually test edge cases and domain-specific phrases.
Consider fine-tuned translation models if accuracy is critical.
Go Beyond the Obvious
With a local LLM, Quarkus, and Langchain4j, you’ve got a real NLP platform, not just a toy demo. All five use cases in this article (and sentiment analysis before them) are production-grade tasks. The pattern is simple: clear interfaces, structured prompts, and tight integration with your existing Java services.
What to Do Next?
Try out new models via Ollama (e.g., Phi-3, Llama 3)
Experiment with embeddings and vector stores
Add output validation, error handling, and streaming support
Consider combining these tools into a full workflow agent
Go through the full Langchain4j and Quarkus workshop
You don’t need OpenAI or Hugging Face APIs to do serious NLP work. You just need Quarkus, Langchain4j, and a local model. Now go build something real.