System Description
- Updated on 19 May 2025
BluVector ATD is a high-performance, advanced threat detection and visibility platform. It uses machine learning methodologies and a variety of advanced analysis modules to find malware across an enterprise. BluVector Sensors are deployed to high-density network aggregation points to passively ingest tapped network traffic. Each BluVector Sensor seamlessly integrates with threat intelligence feeds for real-time correlation, dynamic analysis engines for offloaded sandbox execution, and Security Information and Event Management (SIEM) tools for rapid incident response.
The components of the BluVector ATD platform include sensors, collectors, a Central Manager, virtual machines, and the BluVector Portal. Together, these components make up the BluVector System. The BluVector System solves a variety of technical challenges faced by enterprise security operations centers, such as:
Advanced threat detection
Historical context of network traffic
Sophisticated content analysis
Event and telemetry correlation
Application of cyber threat intelligence
Automation and orchestration of response actions
This article details BluVector ATD concepts, setup, and requirements in the sections below.
You may also find these other articles of the User Manual useful:
Section: System Installation
Section: System Operations
Section: System Configuration
Warning:
Proper operation of this system is dependent on careful adherence to the instructions in this User Manual and appropriate training for systems administrators.
Controlling the Entire Cyber Kill Chain
Cyberattacks can emerge in various forms and stages. You can visualize them on a cyber kill chain that illustrates how threats can enter a system and the possible responses at each stage (see Figure: Cyber Kill Chain).
Fig. 2: Cyber Kill Chain
For example, an attacker might:
Observe your environment to gather specific information about potential target areas.
Find a way to get inside your network based on your vulnerabilities.
Take advantage of your vulnerable areas and inject malware to gain and maintain control.
Obtain higher privileges in your system.
Move to other areas of your network to achieve more access.
Perform activities to hide from detection.
Steal important data from your system or perform other disruptive actions.
BluVector ATD detects threats at the earliest discoverable stage of the kill chain.
Understanding the BluVector ATD System Setup
This manual describes the BluVector ATD system, along with instructions for the installation and operation of the BluVector Sensor and ATD Central Manager. The BluVector ATD system consists of the following parts:
BluVector Sensor - a physical server appliance that provides high-speed capture and analysis of network streams.
BluVector Virtual Machine - a virtual appliance that provides relatively low-throughput analysis of network streams, without the need for additional physical hardware.
ATD Central Manager - a virtual or physical appliance that manages multiple BluVector Sensors and BluVector Virtual Machines.
BluVector Collector - any BluVector Sensor or BluVector Virtual Machine under the control of an ATD Central Manager.
Appliance - any BluVector Collector, BluVector Sensor, or ATD Central Manager.
ATD GUI - provides the central Graphical User Interface (GUI) to the BluVector appliances over a secure application-level network protocol.
BluVector Portal - provides a machine-to-machine, global distribution mechanism for BluVector detection products, as well as a method for sending support data to the BluVector team.
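To make the relationships among these components concrete, the following minimal Python sketch models them as simple classes. The class and field names are illustrative only and are not part of the BluVector software.

```python
from dataclasses import dataclass, field
from typing import List, Union

# Illustrative model of the deployment terminology defined above.
# None of these names come from the BluVector software itself.

@dataclass
class Sensor:
    """Physical server appliance that captures and analyzes network streams."""
    hostname: str
    throughput_gbps: float          # nominal throughput rating, e.g. 5 or 10

@dataclass
class VirtualMachine:
    """Virtual appliance for lower-throughput monitoring (recommended <= 250 Mbps)."""
    hostname: str
    throughput_mbps: float

# A Collector is any Sensor or Virtual Machine placed under a Central Manager.
Collector = Union[Sensor, VirtualMachine]

@dataclass
class CentralManager:
    """Virtual or physical appliance that manages multiple collectors."""
    hostname: str
    collectors: List[Collector] = field(default_factory=list)

    def join(self, appliance: Collector) -> None:
        """Joining a sensor or VM to the Central Manager makes it a collector."""
        self.collectors.append(appliance)

# Example: one Central Manager controlling a hardware sensor and a virtual machine.
cm = CentralManager("atd-cm.example.com")
cm.join(Sensor("sensor-hq", throughput_gbps=10))
cm.join(VirtualMachine("sensor-branch", throughput_mbps=250))
```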
This section outlines the BluVector Sensor internal software components and architecture, as well as where to place the sensor within an enterprise network. The backbone of the BluVector Sensor is a comprehensive file analysis framework referred to as the Scalable File Analyzer (SFA). This software-based framework and the hardware components it runs on are discussed in greater detail in the sections that follow.
Refer to Section: Placing BluVector Sensors on the Network for recommendations about where to place network taps on a typical network topology.
Understanding the Hardware
The BluVector Sensor is built on standard commercial off-the-shelf (COTS) hardware. In Gen3 systems, 5 Gbps nodes can be packaged into a high-density chassis that houses up to four nodes. Gen4 systems are individual 1U nodes (no chassis) and are available in 5 Gbps and 10 Gbps variants. The following table describes the various hardware systems available. vCPU refers to virtual CPU and accounts for hyperthreading of each physical core. Gen3 systems are no longer available for sale.
| Generation | Nominal Throughput Rating | Monitoring Network Interface | Total CPU | Total RAM | Total Raw Hard Disk Space | Rack Height |
|---|---|---|---|---|---|---|
| Gen3 | 1 Gbps | AF_PACKET (10 Gbps physical interface) | 8 CPU | 32 GB | 2 TB | 1U |
| Gen3 | 5 Gbps (node) | AF_PACKET (10 Gbps physical interface) | 88 vCPU | 384 GB | 4 TB | 2U chassis (houses up to 4 nodes) |
| Gen3HD | 10 Gbps (node) | AF_PACKET (10 Gbps physical interface) | 112 vCPU | 512 GB | 15 TB | 2U chassis (houses up to 4 nodes) |
| Gen4 | 5 Gbps | AF_PACKET (10 Gbps physical interface) | 96 vCPU | 384 GB | 4 TB | 1U |
| Gen4 | 10 Gbps | AF_PACKET (10 Gbps physical interface) | 112 vCPU | 512 GB | 15 TB | 1U |
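As a worked example of how the nominal throughput ratings above might be used for sizing, the sketch below picks the smallest currently available (Gen4) model that covers a measured peak traffic rate. The selection logic is illustrative only; consult BluVector for actual sizing guidance.

```python
# Illustrative sizing helper based on the nominal throughput ratings in the
# table above. Only Gen4 models are considered, since Gen3 is no longer sold.
GEN4_MODELS = [
    {"model": "Gen4 5 Gbps", "throughput_gbps": 5, "vcpu": 96, "ram_gb": 384, "disk_tb": 4},
    {"model": "Gen4 10 Gbps", "throughput_gbps": 10, "vcpu": 112, "ram_gb": 512, "disk_tb": 15},
]

def smallest_model_for(peak_gbps: float):
    """Return the smallest Gen4 model whose nominal rating covers the peak rate."""
    for model in sorted(GEN4_MODELS, key=lambda m: m["throughput_gbps"]):
        if peak_gbps <= model["throughput_gbps"]:
            return model
    return None  # peak exceeds a single node; multiple sensors would be needed

print(smallest_model_for(3.2))   # -> Gen4 5 Gbps entry
print(smallest_model_for(12.0))  # -> None (exceeds a single 10 Gbps node)
```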
Placing BluVector Sensors on the Network
The ideal position for the BluVector Sensor is behind existing intrusion prevention systems, spam filters, and firewalls, and in front of all workstations within the internal LAN. This position is advantageous because the filtered link provides the greatest amount of exposure to traffic bound for hosts in both the Demilitarized Zone (DMZ) and the internal LAN. Figure: Example Network Topology depicts the deployment of a BluVector Sensor on a typical enterprise network.
Fig. 3: Example Network Topology
Managing BluVector Sensors through the ATD GUI
All BluVector Sensors are driven by a secure web-based ATD GUI, which is served on the IP address of the eth0 management interface. The default IP address of the system is set during the manufacturing process. You may request a particular default IP address or contact Customer Support to determine which address was assigned.
Gen3 and Gen4 hardware management interfaces have 10 Gbps physical connections.
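If you know the management IP but are unsure whether the ATD GUI is reachable, a quick connectivity probe can help. The sketch below attempts a TLS handshake on the standard HTTPS port; the IP address is a placeholder, and the port is an assumption based on the GUI being a secure web interface.

```python
import socket
import ssl

MGMT_IP = "192.0.2.10"   # placeholder; substitute your sensor's eth0 management IP
GUI_PORT = 443           # assumed standard HTTPS port for the web GUI

def gui_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection and TLS handshake to the GUI succeed."""
    context = ssl.create_default_context()
    # Appliance GUIs frequently use self-signed certificates, so this probe
    # only checks connectivity and deliberately skips certificate validation.
    context.check_hostname = False
    context.verify_mode = ssl.CERT_NONE
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with context.wrap_socket(sock, server_hostname=host):
                return True
    except OSError:
        return False

print(f"ATD GUI reachable at https://{MGMT_IP}/ : {gui_reachable(MGMT_IP, GUI_PORT)}")
```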
Understanding the Network Monitoring Interfaces
The BluVector Sensor passively ingests network traffic streams via SPAN sessions or network taps. The BluVector Sensor accepts up to two (2) 10-Gb fiber channel links or up to four (4) 1-Gb Ethernet links, depending on the model.
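As a quick sanity check when planning SPAN or tap feeds, the following sketch validates a proposed set of monitored links against the interface limits stated above (up to two 10 Gb links or up to four 1 Gb links). The limits come from this section; the check itself is illustrative and does not account for model-specific differences or whether link types can be mixed.

```python
# Illustrative check of a planned tap/SPAN layout against the interface
# limits described above: up to two 10 Gb links or up to four 1 Gb links.
def validate_monitoring_links(link_speeds_gbps):
    """Return a list of human-readable problems with the proposed link set."""
    problems = []
    ten_gb = [s for s in link_speeds_gbps if s == 10]
    one_gb = [s for s in link_speeds_gbps if s == 1]
    other = [s for s in link_speeds_gbps if s not in (1, 10)]

    if other:
        problems.append(f"Unsupported link speeds: {other}")
    if len(ten_gb) > 2:
        problems.append("More than two 10 Gb links")
    if len(one_gb) > 4:
        problems.append("More than four 1 Gb links")
    return problems

print(validate_monitoring_links([10, 10]))      # [] -> acceptable
print(validate_monitoring_links([1, 1, 1, 1]))  # [] -> acceptable
print(validate_monitoring_links([10, 10, 10]))  # flags too many 10 Gb links
```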
Understanding BluVector Virtual Machines
For environments with a total bandwidth of less than 250 Mbps, you may deploy a BluVector Virtual Machine rather than a physical sensor. For example, you might consider BluVector Virtual Machines for monitoring internal network traffic and Internet connections at remote or branch offices. The BluVector Sensor is available as a virtual appliance built for the VMware ESXi hypervisor. The minimum virtual machine requirements to meet a nominal performance of between 250 and 500 Mbps are:
8 vCPU cores
Minimum 32 GB of RAM
Minimum ESX Version: VMware ESXi 6.0 or later
Network Adapter: An Intel network card (with one of the following supported drivers: e1000e, i40e, igb, ixgbe) is required for the virtual sensor to operate in bypass mode, which is needed to achieve the stated performance. The adapter must be dedicated to the virtual sensor and cannot be shared with other BluVector Virtual Machines. In the absence of an Intel network card, the sensor can be configured in emulated driver mode at the cost of lower performance.
Minimum drive capacity: 500 GB
BluVector Virtual Machines may be used to expand monitoring capabilities into the interior of the network to examine “east/west” traffic, or to detect threats against smaller offices or network segments that have relatively low-bandwidth connections to the Internet. Due to the limited processing capacity of virtual machines and the overhead of the hypervisor, the recommended ingest data rate for a BluVector Virtual Machine is 250 Mbps or less.
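When planning a virtual deployment, it can help to check a candidate allocation against the requirements listed above. The sketch below encodes those minimums (8 vCPU, 32 GB RAM, 500 GB disk, and an ingest rate of 250 Mbps or less); the function and field names are illustrative and not part of the product.

```python
# Illustrative pre-deployment check against the BluVector Virtual Machine
# minimums listed above; names and structure are not part of the product.
REQUIREMENTS = {"vcpu": 8, "ram_gb": 32, "disk_gb": 500}
RECOMMENDED_MAX_INGEST_MBPS = 250

def check_vm_plan(vcpu: int, ram_gb: int, disk_gb: int, ingest_mbps: float):
    """Return a list of warnings for a proposed virtual sensor allocation."""
    warnings = []
    if vcpu < REQUIREMENTS["vcpu"]:
        warnings.append(f"vCPU {vcpu} is below the minimum of {REQUIREMENTS['vcpu']}")
    if ram_gb < REQUIREMENTS["ram_gb"]:
        warnings.append(f"RAM {ram_gb} GB is below the minimum of {REQUIREMENTS['ram_gb']} GB")
    if disk_gb < REQUIREMENTS["disk_gb"]:
        warnings.append(f"Disk {disk_gb} GB is below the minimum of {REQUIREMENTS['disk_gb']} GB")
    if ingest_mbps > RECOMMENDED_MAX_INGEST_MBPS:
        warnings.append(f"Ingest rate {ingest_mbps} Mbps exceeds the recommended "
                        f"{RECOMMENDED_MAX_INGEST_MBPS} Mbps for a virtual machine")
    return warnings

print(check_vm_plan(vcpu=8, ram_gb=32, disk_gb=500, ingest_mbps=200))  # []
print(check_vm_plan(vcpu=4, ram_gb=16, disk_gb=250, ingest_mbps=400))  # several warnings
```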
Managing from an ATD Central Manager
The BluVector System may be operated from an ATD Central Manager. The ATD Central Manager provides remote, single-pane-of-glass configuration management of all BluVector Sensors (hardware or virtual appliance) within the enterprise. The ATD Central Manager must have network connectivity to each BluVector Sensor or BluVector Virtual Machine that it will control. All appliances, whether hardware or virtual, when under the control of the ATD Central Manager are referred to as BluVector Collectors. Events generated by the BluVector Sensors may be forwarded to the ATD Central Manager for review. BluVector recommends minimum BluVector Virtual Machine resource allocations of 8 vCPU cores, 32 GB of RAM and 500 GB of storage.
Understanding the ATD GUI
The ATD GUI serves as a single web interface from which network defense analysts can configure, manage, and interact with the system. The primary components of the web interface are:
Customizable dashboards, including the default Overview Dashboard (for keying in on suspicious events) and the Reports Dashboard (for gaining visibility into network operations). See Section: Using the Overview Dashboard and Section: Using the Reports Dashboard for more information.
Powerful event viewer for querying against event and content metadata. See Section: Using the Event Viewer for more information.
On-premises learning for improving and customizing classifiers and viewing statistical machine learning results by filetype. See Section: Evolving Machine Learning Engine Classifiers through On-Premises Learning and Section: Viewing Machine Learning Engine Statistics for more information.
Screen for uploading files and packet captures. See Section: Uploading Files for more information.
Configuration section for System Administrators and Lead Analysts (in a limited manner). See Section: System Configuration for more information.
System documentation for assistance in setting up and using the system, including information on using the BluVector API (see Section: Using the REST API), configuring the sensor, and more. See Section: Accessing Documentation for more information.
Refer to Section: System Configuration and Section: System Operations for complete details on configuring and operating the system.
Warning:
Failure to install BluVector in accordance with these instructions may degrade the performance of the system. Operators are solely responsible for proper installation and use of this system.
Using BluVector ATD Host and Cockpit
All BluVector appliances run on a customized Oracle Linux 8 platform. This platform provides access to system resources independent of the BluVector application. The BluVector ATD Host has a graphical user interface known as Cockpit to facilitate server management (see http://www.cockpit-project.org for more information). You can access the Cockpit interface from the ATD GUI or by navigating to <hostname>:9090.
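The following sketch shows one way to confirm that the Cockpit interface is answering on port 9090 before logging in. The hostname is a placeholder, and certificate verification is skipped because appliances commonly serve self-signed certificates.

```python
import ssl
import urllib.request

HOSTNAME = "sensor-hq.example.com"          # placeholder appliance hostname
COCKPIT_URL = f"https://{HOSTNAME}:9090/"   # Cockpit's default web port

# Connectivity probe only: skip certificate validation, since appliances
# often present self-signed certificates.
context = ssl.create_default_context()
context.check_hostname = False
context.verify_mode = ssl.CERT_NONE

try:
    with urllib.request.urlopen(COCKPIT_URL, context=context, timeout=5) as response:
        print(f"Cockpit responded with HTTP {response.status} at {COCKPIT_URL}")
except OSError as error:
    print(f"Could not reach Cockpit at {COCKPIT_URL}: {error}")
```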
The BluVector ATD Host manages:
System networking configurations, such as IP address, hostname, and proxies
BluVector license
System updates
System user accounts, including remote users (LDAP)
Joining BluVector Sensors to an ATD Central Manager
System backup and restore
Custom user Docker containers
Understanding Collection and the Scalable File Analyzer
The core framework of the BluVector Sensor is an event-handling architecture referred to as the Scalable File Analyzer (SFA). The SFA receives event metadata and associated files from a variety of network protocol processors (referred to as BluVector Collectors) and API endpoints. After computing file metadata, the SFA distributes files to the appropriate analyzers. After the analysis is complete, the SFA generates syslog messages and/or passes files to post analyzers, based on user-defined routing criteria in the system configuration.
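The following Python sketch is a conceptual model of that event flow, not the actual SFA implementation: a file arrives with its event metadata, file-level metadata such as a hash is computed, the file is handed to each applicable analyzer, and the combined result is emitted as a syslog-style message. All names are illustrative.

```python
import hashlib
import json

# Conceptual model of the SFA event flow described above; this is an
# illustration, not BluVector's implementation.

def analyze(file_bytes: bytes, event_meta: dict, analyzers: dict) -> str:
    """Run a file through a set of analyzer callables and emit a JSON 'syslog' line."""
    file_meta = {
        "sha256": hashlib.sha256(file_bytes).hexdigest(),
        "size": len(file_bytes),
    }
    results = {name: fn(file_bytes) for name, fn in analyzers.items()}
    record = {"event": event_meta, "file": file_meta, "analysis": results}
    return json.dumps(record)

# Toy analyzers standing in for engines such as the MLE or ClamAV.
analyzers = {
    "toy_signature": lambda data: b"EICAR" in data,
    "toy_size_check": lambda data: len(data) > 1_000_000,
}

line = analyze(b"hello world", {"src_ip": "192.0.2.5", "protocol": "http"}, analyzers)
print(line)  # would be routed to syslog outputs or post-analyzers by the SFA
```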
The embedded analysis and collection engines within the BluVector Sensor are:
Machine Learning Engine (MLE): Model-based static analyzer that uses machine learning to classify files as benign or malicious.
Speculative Code Execution Engine (SCE or NEMA): Machine learning and heuristics-based analyzer that detects suspicious shellcode and JavaScript.
Zeek (formerly known as Bro): Network protocol analyzer and security monitoring utility.
ClamAV: Open-source and signature-based antivirus engine.
Yara: Rule-based malware identification and classification utility (see the sketch following this list).
IntelLookup: High-speed module that correlates BluVector metadata against external intelligence feeds.
hURI: High-speed URL analytic that identifies suspicious URLs.
IOCHunter: Enriches file metadata by extracting potentially interesting indicators from the file binary such as links, URLs, domains, and email addresses.
Extractor: Resubmits files embedded within archives into the SFA framework for analysis.
Suricata: Signature-based intrusion detection engine.
Host/User Information: Optional service that enriches events with host/entity and user information.
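To illustrate what rule-based identification with Yara looks like in practice, the sketch below compiles a trivial rule and scans a buffer using the yara-python bindings. This runs on any workstation with the yara-python package installed; it is not how rules are loaded onto the sensor, and the rule itself is a toy example.

```python
# Requires the yara-python package (pip install yara-python).
import yara

# A deliberately trivial rule: flag any content containing a marker string.
TOY_RULE = r"""
rule toy_marker_string
{
    strings:
        $marker = "suspicious-marker"
    condition:
        $marker
}
"""

rules = yara.compile(source=TOY_RULE)
matches = rules.match(data=b"some content with a suspicious-marker inside")
print([m.rule for m in matches])   # ['toy_marker_string']
```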
The following table lists the versions of key open-source security tools currently included in the BluVector Sensor. Any scripts or signatures manually applied to the system must be supported by the available version of the software.
| Open-Source Security Tool | Available Version |
|---|---|
| Zeek (formerly known as Bro) | 4.0.7 |
| Suricata | 5.0.8 |
| ClamAV | 0.103.7 |
| Yara | 4.2.3 |
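If you have shell access and want to confirm which versions are installed, the sketch below shells out to each tool's standard version flag and prints the output. It assumes the tools are on the PATH; the flags are the tools' own documented options, not BluVector-specific commands.

```python
import shutil
import subprocess

# Standard version flags for each bundled open-source tool.
VERSION_COMMANDS = {
    "Zeek": ["zeek", "--version"],
    "Suricata": ["suricata", "-V"],
    "ClamAV": ["clamscan", "--version"],
    "Yara": ["yara", "--version"],
}

for name, command in VERSION_COMMANDS.items():
    if shutil.which(command[0]) is None:
        print(f"{name}: not found on PATH")
        continue
    result = subprocess.run(command, capture_output=True, text=True)
    # Some tools print the version to stdout, others to stderr.
    output = (result.stdout or result.stderr).strip()
    first_line = output.splitlines()[0] if output else "no output"
    print(f"{name}: {first_line}")
```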
Using the BluVector Portal
BluVector appliances may communicate with a global BluVector management and service delivery system known as the BluVector Portal. The BluVector Portal handles only machine-to-machine requests. The portal supports a number of BluVector services and analytic systems. The portal resides in a public cloud infrastructure that is accessible via the Internet. All communications between BluVector appliances and the BluVector Portal are encrypted. If you are operating a BluVector appliance in an air-gapped network or are otherwise unable to access the BluVector Portal, there are specific instructions for applying the updates that are typically distributed via the portal (see Section: Upgrading the System in an Air-Gapped Network).
The BluVector Portal supports:
Handling Submit to BV requests that allow you to receive assessments on samples from the BluVector threat research team
Handling support bundle submissions
Distributing ClamAV signature updates
Distributing BluVector supplied Yara rules
Distributing BluVector supplied Zeek scripts
Distributing Machine Learning Engine classifiers and data bundles
Handling dynamic malware analysis requests through a cloud-based sandbox
Collecting system telemetry
Understanding System Telemetry
The BluVector Portal collects engineering data, referred to as telemetry, that is useful in improving future versions of the system. No event-specific metadata is included in the telemetry feed from a system to the portal. Telemetry data is uploaded at user-configurable intervals, ranging from once an hour to once every three days. Metrics are either the latest value recorded prior to the upload or a cumulative value over the time frame between uploads. See Section: Configuring BluVector Portal for instructions on configuring system telemetry.
The feed consists of the fields shown in the table below.
| Telemetry Field | Description |
|---|---|
| Disk Usage | Quantity of hard disk space present and used |
| Memory Usage | Quantity of memory used, free, buffered, and available; also, swap space consumed |
| Event Count | Number of events generated |
| ThreatRecord Count | Number of ThreatRecords generated |
| Event Rate | Rate of event production |
| Packet Processing | Number of packets received and dropped (inbound and outbound) at the network interface card |
| Bytes Received | Number of bytes received at the network interface card |
| Event Type | Number of each event type generated |
| Process Counts and Status | Number of active Zeek worker processes; Zeek and Suricata process status |
| CPU Utilization | Percent of CPU resources being consumed |
| System Uptime | Time since last system reboot |
| BV Health | List of BV Health alerts by alert identifier |
| Events By Output | Number of events sent to each configured output (seen and failed) |
| Flags By Analyzer | Number of events or files flagged by each analyzer |
| Events By Meta.App | Number of events for each seen meta.app (eml, http, etc.) |
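As an illustration of the distinction between point-in-time metrics and cumulative metrics described above, the sketch below assembles a hypothetical telemetry snapshot using a few of the field names from the table. The values and structure are invented for illustration and do not represent BluVector's actual telemetry format.

```python
import time

# Hypothetical telemetry snapshot, for illustration only. Real telemetry
# content and formatting are determined by the BluVector system.
counters_since_last_upload = {"Event Count": 0, "ThreatRecord Count": 0}

def record_event(is_threat: bool) -> None:
    """Accumulate counts between uploads (cumulative metrics)."""
    counters_since_last_upload["Event Count"] += 1
    if is_threat:
        counters_since_last_upload["ThreatRecord Count"] += 1

def build_snapshot(cpu_percent: float, uptime_seconds: int) -> dict:
    """Combine cumulative counters with latest-value metrics at upload time."""
    snapshot = {
        "CPU Utilization": cpu_percent,     # latest value prior to upload
        "System Uptime": uptime_seconds,    # latest value prior to upload
        **counters_since_last_upload,       # cumulative since previous upload
        "timestamp": int(time.time()),
    }
    counters_since_last_upload.update({"Event Count": 0, "ThreatRecord Count": 0})
    return snapshot

record_event(is_threat=False)
record_event(is_threat=True)
print(build_snapshot(cpu_percent=42.5, uptime_seconds=86_400))
```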