© 2025 SOC Prime Inc.
All rights reserved. This product and its related documentation are protected by copyright and distributed under licenses restricting their use, copying, distribution, and decompilation. No part of this product or its related documentation may be reproduced in any form or by any means without the prior written authorization of SOC Prime. While every precaution has been taken in the preparation of this book, SOC Prime assumes no responsibility for errors or omissions. This publication and the features described herein are subject to change without notice.
Uncoder AI is a private, non-agentic AI for threat-informed detection engineering. This major release, Uncoder AI 4.0, introduces a robust set of features designed to enhance how detection rules are created, translated, and optimized across the most popular technologies, helping security teams stay ahead in the evolving cybersecurity landscape. Uncoder AI is powered by market-leading public large language models (LLMs), such as OpenAI, Gemini, DeepSeek, and Llama, together with SOC Prime’s private machine learning models trained on the world’s largest dataset of 1,000,000+ detection rules and queries with 13,000+ labels, all running on SOC Prime’s private cloud.
AI Tools: Detection Query Summary & Optimization
With the latest Uncoder AI release, we have significantly expanded AI capabilities to help users analyze and optimize detection code more efficiently. This enhancement enables users to generate concise summaries or detailed decision trees of their detection logic, providing deeper insights into detection rules and queries.
In the AI Tools tab, located in the upper panel of Uncoder AI, users can now access four new options for detection logic analysis and optimization:
Short Summary – Quickly grasp the intent and detection method at a glance
Full Summary – Gain a comprehensive breakdown of the detection logic, covering all aspects of the query or rule
Decision Tree – Follow the detection logic step by step to understand its exact execution
Query Optimization – Receive detailed recommendations to enhance query performance and efficiency
To use these features, simply insert your code into the left panel or select a detection rule from the Detection Rules search. Then, choose the appropriate AI Tool to analyze the code—results will be displayed automatically in the right panel.
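For instance, a short Sigma rule like the one below (a hypothetical illustration, not a rule shipped with the Platform) could be pasted into the left panel; the Short Summary tool would then describe its intent (detecting a PowerShell download cradle) and its detection method (process creation events filtered by image and command-line patterns):

```yaml
title: Suspicious PowerShell Download Cradle
status: experimental
description: Detects PowerShell invoking a download cradle, a common initial-access technique.
logsource:
  category: process_creation
  product: windows
detection:
  selection:
    Image|endswith: '\powershell.exe'
    CommandLine|contains:
      - 'Net.WebClient'
      - 'DownloadString'
  condition: selection
level: high
```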
This functionality is powered by LLMs operating in a fully user-controlled environment that complies with strict data privacy and IP protection requirements:
llama3.3:70b – A 70-billion-parameter model in the Llama series, known for its scalability and adaptability. Privately hosted by SOC Prime.
llama3.1:8b – A lighter-weight, 8-billion-parameter model in the Llama series, offering a balance of speed and capability. Privately hosted by SOC Prime.
deepseek-r1:70b – DeepSeek’s first-generation reasoning model, achieving performance comparable to OpenAI’s o1, with a 128K-token context window. Privately hosted by SOC Prime.
mistral-large:123b – A 123-billion parameter model by Mistral AI, focusing on high performance and efficiency in natural language processing tasks. Privately hosted by SOC Prime.
gpt-4o-mini – A compact version of OpenAI’s GPT-4o, designed for strong reasoning capabilities with reduced computational requirements and faster response times. Publicly hosted by OpenAI and accessed via API.
o3-mini – OpenAI’s compact reasoning model, emphasizing improved reasoning capabilities at a lower cost, with faster response times and reduced computational requirements. Publicly hosted by OpenAI and accessed via API.
gemini-2.0-flash – A lightweight and efficient version of Google’s Gemini 2.0, optimized for fast response times and cost-effective deployment while maintaining strong language understanding. Publicly hosted by Google and accessed via API.
Note: Currently, users can only choose the llama3.3:70b model; the remaining models on the list will become available shortly.
Translate Functions With AI
The latest release unlocks more advanced capabilities for cross-platform detection rule translation, giving security teams the freedom to migrate, learn, and adapt to new technologies effortlessly. Even for queries with complex logic and multiple functions, Uncoder AI intelligently analyzes and translates unsupported functions using AI-powered enhancements. Built on a privacy-first approach, Uncoder AI keeps security engineers in full control of their interaction with AI: they decide what exactly to send, when to send it, and whether to enable AI functionality at all.
The first time they use the AI-powered translation functionality in Uncoder AI, SOC Prime users will see the Boost Your Translation With External AI? pop-up prompting them to allow or decline the use of the OpenAI API. The pop-up displays the following text: “Only correlation functions, such as aggregation functions, are sent to OpenAI API. The core detection logic and translations are done locally at SOC Prime SOC 2 Type II private AWS cloud segment, so no sensitive data ever leaves our cloud.”
By clicking Allow, the Translate functions with toggle switch in the right-hand corner of the right Uncoder AI panel will be automatically turned ON.
By clicking Deny, the Translate functions with toggle switch will remain OFF. Still, the user can always update these settings.
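To illustrate the kind of logic this applies to, consider a Sigma correlation rule with a count-based aggregation (a hypothetical example following the Sigma correlations specification; the referenced rule name `failed_logon` is assumed). The base detection logic is translated locally, while an aggregation like this, if unsupported by the target platform, is the part that would be sent to the external AI for translation:

```yaml
title: Multiple Failed Logons for a Single User
correlation:
  type: event_count        # count-based aggregation over matching events
  rules:
    - failed_logon         # hypothetical base rule name
  group-by:
    - TargetUserName
  timespan: 5m
  condition:
    gte: 10                # fire when 10+ failed logons occur within 5 minutes
```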
To leverage AI-enriched translation of advanced functions:
Turn the Translate functions with toggle switch ON (if it was OFF).
Click Translate.
Two tabs will appear in the editor on the right:
The Sigma tab (open by default)
The target language tab
Note: If you navigate to the target language tab and click Translate again, you will remain on the selected target language tab rather than being automatically redirected to the Sigma tab.
While the AI response is being processed, a loading indicator is displayed; processing may take a couple of seconds.
When translating advanced functions with AI, the functions submitted to AI for processing are highlighted for improved readability. Hover over a function to see a tooltip with its name and description. Clicking a specific word in the code also highlights all of its occurrences across the code.
Status Icons in Cross-Platform Translation
For a better user experience, the tab for the selected target language now displays an icon (if applicable) indicating the translation status:
No icon when AI is not used
An in-progress icon that disappears once the translation is complete
A checkmark icon when the translation is successfully generated and complete
An exclamation mark icon if there are translation issues
A crossed-out circle icon if the translation fails entirely
AI-Assisted Translation of Detection Content Available in Threat Detection Marketplace and Uncoder AI
With this release, we have generated translations of all non-Sigma detection rules (e.g., Microsoft Sentinel, Elastic, and Splunk native rules & queries) from GitHub repositories into all language formats currently supported by the SOC Prime Platform.
Note: AI-powered content translations were generated only for non-Sigma rules present in the SOC Prime Platform repositories. Custom repositories remain unaffected.
Every AI-generated translation now includes a field indicating that it was produced by AI, ensuring transparency about AI-powered content generation. To enhance visibility, an AI-generated icon has been added to indicate AI-assisted translations. This icon will be displayed next to the platform name and all language format names within the chosen platform.
Note: AI-assisted translations are NOT included in Dynamic Content Lists and are NOT used for Attack Detective scans.
Debug Console
With this Uncoder AI release 4.0, we’ve added a Debug Console that displays the following:
All system errors
All backend messages, including unsupported functions and fields
Results of Warden checks
Results of the validation in ML
All translation issues
By default, the Debug Console panel opens in a semi-collapsed state only when it contains content and is cleared when the Translate button is clicked or the page is reloaded.
The panel header contains:
A red dot if there is content
A counter showing the number of lines that need debugging
Automated Query Language Identification Powered by AI
In this release, we have improved the accuracy of automated query language detection in Uncoder AI, resulting in a more streamlined detection content translation.
Key Bug Fixes & Improvements
With this Uncoder AI release, we’ve fixed Supercharge mode issues where, in certain cases, a Roota rule was not generated at all.
