24 Oct 2025 • 1 minute read
RAG and NLQ in Generative AI: How natural language queries transform data analytics
Saša Mehmedagić
Software Engineer
At a recent FridayTalk, our software engineer, Saša Mehmedagić, presented an overview of an innovative AI Assistant developed and successfully deployed for one of our clients.
Saša demonstrated how Retrieval-Augmented Generation (RAG) and Natural Language Query (NLQ) in Generative AI empower users to perform data analytics with natural language and conversational inputs.
The technology behind the AI Assistant
Saša began by detailing the technical architecture, explaining how the feature translates a user’s question into a secure, executable database query. The core of the system is built on a serverless Generative AI architecture on AWS Lambda, powered by Anthropic’s Claude models via Amazon Bedrock.
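To make that setup concrete, a minimal sketch of such a Lambda handler might look like the following, using the Bedrock Runtime Converse API from the AWS SDK. The event shape, model ID, and system prompt are illustrative assumptions rather than details of the actual implementation.

```typescript
import { BedrockRuntimeClient, ConverseCommand } from "@aws-sdk/client-bedrock-runtime";

// One client per Lambda execution environment, reused across warm invocations.
const bedrock = new BedrockRuntimeClient({ region: process.env.AWS_REGION });

// Placeholder model ID; a real deployment would pin a specific Claude version.
const MODEL_ID = process.env.MODEL_ID ?? "anthropic.claude-3-5-sonnet-20240620-v1:0";

// Hypothetical event shape: the frontend sends the user's natural-language question.
export const handler = async (event: { question: string }) => {
  const response = await bedrock.send(
    new ConverseCommand({
      modelId: MODEL_ID,
      system: [
        { text: "Translate the user's question into a JSON query plan for the analytics database." },
      ],
      messages: [{ role: "user", content: [{ text: event.question }] }],
      inferenceConfig: { maxTokens: 1024, temperature: 0 },
    })
  );

  // The model's reply is expected to be a JSON query plan, validated downstream.
  const planText = response.output?.message?.content?.[0]?.text ?? "";
  return { statusCode: 200, body: planText };
};
```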
Outlining a multi-step flow, Saša demonstrated how a user’s question is first converted into a structured JSON query plan, which is then validated and executed securely by the backend. The retrieved data is subsequently passed to a second language model to generate a concise, formatted, and user-friendly answer.
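The exact plan format was not published, but the flow could be sketched roughly as follows. The `QueryPlan` shape and the `invokeModel` and `runQuery` abstractions (the former could wrap the Bedrock call above) are assumptions for illustration, not the client’s actual schema.

```typescript
// Hypothetical structured query plan the first model is asked to produce.
interface QueryPlan {
  table: string;
  metrics: string[];
  filters: Record<string, string | number>;
  dateRange?: { from: string; to: string };
}

// Abstractions over the two external dependencies: an LLM call and the database.
type InvokeModel = (systemPrompt: string, userPrompt: string) => Promise<string>;
type RunQuery = (plan: QueryPlan) => Promise<unknown[]>;

export async function answerQuestion(
  question: string,
  invokeModel: InvokeModel, // e.g. the Bedrock Converse call sketched earlier
  runQuery: RunQuery        // backend that builds and executes the real query
): Promise<string> {
  // Step 1: the first model call turns the question into a structured JSON plan.
  const planJson = await invokeModel(
    "Return only a JSON query plan with table, metrics, filters and dateRange.",
    question
  );

  // Step 2: parse and validate before anything touches the database.
  const plan = JSON.parse(planJson) as QueryPlan;
  if (!plan.table || !Array.isArray(plan.metrics) || plan.metrics.length === 0) {
    throw new Error("Model returned an invalid query plan");
  }

  // Step 3: the backend executes the validated plan securely (no raw SQL from the model).
  const rows = await runQuery(plan);

  // Step 4: a second model call turns the raw rows into a concise, user-friendly answer.
  return invokeModel(
    "Summarize the query result for a business user in two or three sentences.",
    JSON.stringify({ question, rows })
  );
}
```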
Overcoming prompt engineering and security challenges
Saša also discussed the key prompt engineering challenges encountered during development, including resolving ambiguity in user requests (such as relative timeframes), enforcing domain-specific business logic, and implementing essential security guardrails to protect data privacy and prevent unauthorized actions.
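As a rough illustration of such guardrails, the sketch below resolves a relative timeframe deterministically in code and checks a query plan against an allow-list before execution. The allow-list contents and the plan shape are invented for the example, not taken from the client’s system.

```typescript
interface QueryPlan {
  table: string;
  metrics: string[];
  dateRange?: { from: string; to: string };
}

// Assumed allow-list: only these tables and metrics may ever reach the database.
const ALLOWED_TABLES = new Set(["orders", "invoices"]);
const ALLOWED_METRICS = new Set(["revenue", "order_count", "avg_basket_size"]);

// Resolve an ambiguous relative timeframe ("last month") in code rather than in the
// prompt, so the result is deterministic and auditable.
export function resolveLastMonth(today = new Date()): { from: string; to: string } {
  const firstOfThisMonth = new Date(Date.UTC(today.getUTCFullYear(), today.getUTCMonth(), 1));
  const firstOfLastMonth = new Date(Date.UTC(today.getUTCFullYear(), today.getUTCMonth() - 1, 1));
  return {
    from: firstOfLastMonth.toISOString().slice(0, 10),
    to: new Date(firstOfThisMonth.getTime() - 86_400_000).toISOString().slice(0, 10),
  };
}

// Guardrail: reject anything the model produced that falls outside the allow-list.
export function enforceGuardrails(plan: QueryPlan): QueryPlan {
  if (!ALLOWED_TABLES.has(plan.table)) {
    throw new Error(`Table "${plan.table}" is not permitted`);
  }
  const blocked = plan.metrics.filter((m) => !ALLOWED_METRICS.has(m));
  if (blocked.length > 0) {
    throw new Error(`Metrics not permitted: ${blocked.join(", ")}`);
  }
  return plan;
}
```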
Simplifying integration through MCP
Looking toward the future, our software engineer introduced the Model Context Protocol (MCP), framing it as a potential “USB-C for AI.” He explained MCP as one of the emerging AI integration standards that simplifies how models connect to tools and data sources.
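For a rough idea of what that looks like in practice, here is a minimal tool server sketched with the MCP TypeScript SDK. The exact API differs between SDK versions, and the tool itself is an invented example rather than anything from the talk.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// A tiny MCP server exposing one tool; any MCP-capable client can discover and call it.
const server = new McpServer({ name: "analytics-demo", version: "0.1.0" });

// Hypothetical tool: a real handler would call the actual analytics backend.
server.tool(
  "get_monthly_revenue",
  { month: z.string().describe("Month in YYYY-MM format") },
  async ({ month }) => ({
    content: [{ type: "text", text: `Revenue for ${month}: (stubbed result)` }],
  })
);

// Expose the server over stdio, the transport most local MCP hosts use.
const transport = new StdioServerTransport();
await server.connect(transport);
```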
The talk showed how combining RAG and NLQ in Generative AI can transform data analytics and AI integration.
Explore more of our projects, stories, and tech insights on our Blog.
Saša Mehmedagić
Software Engineer
Saša Mehmedagić is a full-stack developer specializing in TypeScript frameworks, such as Angular and React, as well as Node.js and serverless backend solutions. AWS-certified and cloud-native by design, he builds scalable systems with a strong focus on performance and maintainability.
With years of experience in the automotive industry, Saša contributes to both the architecture and implementation of robust platforms. His clean code mindset, DevOps know-how, and end-to-end expertise make him a valuable asset to the team.