
Packet De-duplication

Remove duplicate packets to boost network monitoring accuracy and security tool effectiveness.

Your Packets Need De-Duplication

How many duplicate packets are weighing down your systems? Packet duplication occurs when the multiple tap and aggregation points deployed throughout the data center to acquire traffic yield redundant copies of the same packets. Duplicates can also be caused by inter-VLAN communication, incorrect switch policy settings, or unavoidable SPAN/mirror port configurations.

Unfortunately, these additional copies can exceed 80 percent of the traffic volume seen on your network, consuming vital monitoring infrastructure bandwidth and tool computing power. As a result, tools get overwhelmed, lose effectiveness, and produce inaccurate analysis.

Problems that arise from duplicated packets include:

  • Misinterpretation by security tools, resulting in false positives
  • Improper performance diagnosis due to artificially elevated packet and byte counts
  • Reduced data retention periods on forensic recorders due to added storage burdens
  • Inaccurate flow data in NetFlow/IPFIX reports

Eliminate duplicate packets to reduce traffic sent to network and security tools by 50 percent.

Gigamon Solution — Duplicate Packet Removal with GigaSMART

Stop Overloading Your Tools with Identical Packets

What is packet de-duplication? De-duplication is a technique whereby duplicate packets are identified and dropped, while unique packets are left untouched.
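At its core, the technique amounts to keeping a short-lived record of packets already seen and dropping any new arrival that matches one. The Python sketch below is an illustrative model of that idea only; the function names, the SHA-256-over-raw-bytes keying, and the default window are assumptions for demonstration, not how a hardware appliance such as GigaSMART implements it:

```python
import hashlib
import time
from collections import OrderedDict

def make_deduplicator(window_seconds=0.05):
    """Return a filter that reports whether a packet was already seen
    within the detection window. Illustrative only: a real appliance
    hashes selected fields in hardware, not raw bytes in software."""
    seen = OrderedDict()  # packet digest -> time first seen

    def is_duplicate(packet_bytes, now=None):
        now = time.monotonic() if now is None else now
        # Evict records older than the detection window (oldest first).
        while seen:
            digest, first_seen = next(iter(seen.items()))
            if now - first_seen > window_seconds:
                seen.pop(digest)
            else:
                break
        key = hashlib.sha256(packet_bytes).digest()
        if key in seen:
            return True    # duplicate: drop it
        seen[key] = now
        return False       # unique: forward it

    return is_duplicate
```

The 50 ms default here is arbitrary; the right window depends on how far apart in time your tap points deliver copies of the same packet, which is why the detection window is a configurable parameter in the first place.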

The GigaSMART® De-duplication application targets, identifies, and eliminates duplicate packets, blocking unnecessary duplication and sending optimized flows to your security and network monitoring tools. GigaSMART offloads the de-duplication task from these tools, letting you centralize this function while providing multiple tools with the same feed.

GigaSMART De-duplication benefits include:

  • Significantly improved tool effectiveness and accuracy
  • Increased traffic analysis capacity without added tool expenditure
  • Better security analytics with fewer false positives
  • Faster, more accurate forensics and malware detection
  • Reduced tool storage capacity requirements

GigaSMART De-duplication is accurate and customizable. It lets you tune duplicate detection to improve accuracy and effectiveness. For example, you can specify whether two packets that are identical except for specific headers or fields (for example, MAC address, IP TOS or TCP Sequence number) are considered duplicates. In addition, the duplicate detection window is configurable to match your network traffic acquisition deployment.
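One way to picture the header-tolerance option is to zero out the fields you want ignored before computing the comparison key, so packets that differ only in those fields hash identically. The byte offsets and helper below are illustrative assumptions based on an Ethernet II + IPv4 layout, not Gigamon configuration values:

```python
import hashlib

# Illustrative byte ranges a user might ask the de-duplicator to ignore.
# Offsets assume an Ethernet II header (14 bytes) followed by IPv4.
IGNORE_RANGES = [
    (0, 12),   # destination and source MAC addresses
    (15, 16),  # IPv4 TOS/DSCP byte
    (22, 23),  # IPv4 TTL (decremented at each router hop)
    (24, 26),  # IPv4 header checksum (changes whenever TTL does)
]

def dedup_key(frame: bytes, ignore_ranges=IGNORE_RANGES) -> bytes:
    """Hash a frame with the configured fields zeroed, so packets that
    differ only in those fields compare as duplicates."""
    buf = bytearray(frame)
    for start, end in ignore_ranges:
        for i in range(start, min(end, len(buf))):
            buf[i] = 0
    return hashlib.sha256(bytes(buf)).digest()
```

With this keying, the same packet captured on two sides of a router (different MACs, decremented TTL) yields one key, while packets with different payloads keep distinct keys.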

The application also supports service chaining so that you can direct the de-duplicated traffic stream to multiple GigaSMART applications in succession, such as to the NetFlow generation and export app.


TRAINING


GigaSMART Overview

Data de-duplication unburdens your network of duplicate packets so it can run at hyper speed. To learn more, check out our GigaSMART packet de-duplication training video.

Gigamon Community

Get the latest technical content and participate in discussions in the Gigamon VÜE Community.

GigaSMART Features

GigaSMART® offers several essential traffic intelligence services required for deep observability into infrastructure blind spots.

NetFlow Generation

Delivers basic Layer 2–4 network traffic data to analysis tools.

Source Port Labeling

Provides context to packets and identifies behaviors and threats based on network location.

Adaptive Packet Filtering

Identifies patterns across any part of the network packet, including the packet payload.

De-duplication

Targets, identifies, and eliminates duplicate packets, blocking unnecessary duplication and sending optimized flows.

Packet Slicing

Truncates packets while preserving the protocol headers required for network analysis.

Advanced Load Balancing

Divides and distributes traffic among multiple tools based on a variety of data points.

Advanced Flow Slicing

Reduces bandwidth usage by slicing payloads and packets from long data flows.

Masking

Provides customizable data protection by overwriting specific packet fields with a set pattern. 

Tunneling

Alleviates blindness of business-critical traffic at remote sites, virtualized data centers, or in the public or private cloud.

Application Visualization

Provides a complete view of the applications running on your network.

Application Filtering Intelligence

Extends Layer 7 visibility to thousands of common and proprietary applications.

Application Metadata Intelligence

Provides application visibility to identify performance and quality issues, and potential security risks.

GTP Correlation

Enables mobile service providers to monitor subscriber data in GPRS Tunneling Protocol tunnels.

FlowVUE Flow Sampling

Provides subscriber IP-based flow sampling.

5G Correlation

Forwards subscriber sessions to specific tools by filtering on subscriber, user device, RAN, or network slice IDs.

SSL/TLS Decryption

Decrypts SSL/TLS traffic so security and monitoring tools can inspect encrypted sessions.


Take a Gigamon Tour

See the tech. Touch the tech.

Frequently Asked Questions

What is data de-duplication?

Data de-duplication is a network optimization process that identifies and eliminates duplicate packets from network traffic streams.

The meaning of de-duplication extends to any technique that removes redundant data copies while preserving unique information. In networking contexts, this process ensures that monitoring and security tools receive only distinct packets, preventing system overload and improving analysis accuracy.

What is de-duplication also called?

“Dedupe” is the most common abbreviation for de-duplication. Other related terms include data reduction, duplicate elimination, and redundancy removal. In network monitoring specifically, it's often referred to as packet de-duplication or traffic optimization.

What is an example of de-duplication?

A practical example occurs when multiple network monitoring points capture the same packets due to SPAN port configurations or inter-VLAN communication. Packet de-duplication technology identifies these identical packets by analyzing headers and payload content, then eliminates the duplicates while forwarding only unique packets to security tools.

A real-world scenario involves organizations with VMware virtual environments, where VM-to-VM traffic creates significant packet duplication. When multiple monitoring systems attempt to capture the same inter-VM communications, dedupe technology is essential to prevent tool overload. By implementing data de-duplication, network teams can focus their security and monitoring tools on unique traffic flows rather than processing redundant packet copies.

For comprehensive implementation details and measurable results, read the complete Gigamon case study showcasing successful de-duplication deployment in action.

Why is packet de-duplication important?

Packet de-duplication ensures that network monitoring and security tools process only unique packets, improving accuracy, reducing false positives, and conserving bandwidth and storage. Without dedupe functionality, duplicate packets can comprise a large portion of network traffic, overwhelming monitoring infrastructure and causing:

  • Inaccurate threat detection and false security alerts
  • Inflated performance metrics leading to misdiagnosis
  • Premature storage capacity exhaustion
  • Reduced forensic analysis effectiveness

What is the difference between de-duplication and compression?

Data de-duplication and compression are two optimization techniques that address different aspects of data management. Packet de-duplication removes redundant packets from the stream entirely, ensuring only unique traffic reaches monitoring tools. Compression, on the other hand, reduces the size of existing data without eliminating any information.

While compression makes data smaller for storage or transmission efficiency, de-duplication eliminates unnecessary data copies to improve processing accuracy. These techniques complement each other: networks can implement both packet de-duplication to remove duplicate traffic and compression to optimize remaining data storage requirements.
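The distinction is easy to make concrete in a few lines of Python; the packet contents below are invented purely for illustration:

```python
import zlib

# Four identical copies of one (invented) packet arriving from four taps.
payload = b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n\r\n" * 20
packets = [payload] * 4

# De-duplication: drop whole redundant copies; the surviving packet is
# byte-for-byte untouched.
unique = list(dict.fromkeys(packets))

# Compression: every packet shrinks, but all four copies still reach
# the tools and must be decompressed before analysis.
compressed = [zlib.compress(p) for p in packets]
```

After de-duplication the tools see one intact packet; after compression they still see four packets, just smaller ones.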

Have Questions?

We're here to help you find the right solution for your business.

By submitting this form, you agree to our Terms & Agreement. View our Privacy Statement.

Related Pages