Clean data is the foundation of accurate analytics. Twinalyze helps you control event quality, identify tracking issues, and keep your reports, funnels, retention, user profiles, campaigns, and Ask AI insights reliable. This page covers:
- Duplicate and missing events
- Wrong event names
- Invalid event properties
- Bot and unwanted traffic
- Test, internal, and spam traffic
- Best practices before going live
Data quality overview
| Area | What can go wrong | Impact |
|---|---|---|
| Duplicate Events | Same event is sent more than once | Inflated reports and wrong counts |
| Missing Events | Expected event is not received | Broken funnels and incomplete journeys |
| Wrong Event Names | Same action has multiple names | Split reports and confusing analysis |
| Invalid Properties | Missing or wrong property values | Filters, segments, and cohorts become unreliable |
| Bot Traffic | Non-human activity is tracked | Fake sessions, users, and page views |
| Internal/Test Traffic | QA or developer data mixes with live data | Production reports become inaccurate |
Data quality checks
- Duplicate Events
- Missing Events
- Wrong Event Names
- Invalid Properties
Duplicate events happen when the same user action is tracked more than once.
Common examples:
- `button_clicked` fires two times for one click
- `screen_viewed` fires again when the screen is already visible
- `purchase_completed` is sent multiple times for one order
- SDK auto tracking and manual tracking both send the same event
How to prevent duplicate events
Use unique identifiers
Use values like `eventId`, `orderId`, `sessionId`, or `screenId` for important events.
Check sudden spikes
Review events that suddenly increase without a real product or marketing change.
Recommended duplicate-safe event
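A duplicate-safe event can be sketched as follows. This is a minimal TypeScript illustration, not the actual Twinalyze SDK API: the `track` function, the `EventPayload` shape, and the in-memory `Set` are assumptions made for the example.

```typescript
// Illustrative sketch: a duplicate-safe tracking wrapper keyed on eventId.
// The names below (EventPayload, track) are assumptions, not SDK calls.

type EventPayload = {
  eventId: string;                       // unique per user action, e.g. a UUID or orderId
  name: string;                          // e.g. "purchase_completed"
  properties?: Record<string, unknown>;
};

const seenEventIds = new Set<string>();

function track(event: EventPayload): boolean {
  // Skip any event whose eventId was already sent in this session.
  if (seenEventIds.has(event.eventId)) {
    return false; // duplicate, not sent
  }
  seenEventIds.add(event.eventId);
  // The real network call to the analytics backend would go here.
  return true; // sent
}
```

Keying on a stable identifier such as the `orderId` for purchases also lets the backend deduplicate retried requests server-side.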
Bot and unwanted traffic
Bot and other unwanted traffic can make your analytics data inaccurate. Twinalyze helps you separate real user behavior from fake, automated, internal, or test activity.
Cleaner Reports
Remove bot-like activity from reports, funnels, retention, and user insights.
Better User Accuracy
Separate real users from fake, test, crawler, or automated traffic.
Improved Decisions
Avoid making product decisions based on invalid traffic.
Common unwanted traffic types
- Bots
- Internal Traffic
- Spam Traffic
Bots are automated systems that visit your website or app without real user intent.
Common examples:
- Search engine crawlers
- Scrapers
- Automated testing tools
- Headless browser traffic
- Fake traffic generators
Bot indicators
| Signal | Example |
|---|---|
| Very high event frequency | Hundreds of events in a short time |
| No real interaction pattern | Page views without clicks or meaningful actions |
| Suspicious user agent | Headless browser, crawler, or automation tool |
| Repeated IP/device pattern | Same source creating many sessions |
| Invalid session behavior | Session duration too short or too repetitive |
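The "very high event frequency" signal from the table above can be computed with a simple sliding-window count. This is a sketch with an assumed threshold, not a Twinalyze default:

```typescript
// Illustrative sketch: flag a session whose recent event rate looks automated.
// The 60-second window and the limit are assumptions for the example.
function isHighFrequency(timestampsMs: number[], limit = 100): boolean {
  // Count events inside the 60-second window ending at the newest event.
  const newest = Math.max(...timestampsMs);
  const recent = timestampsMs.filter((t) => newest - t <= 60_000);
  return recent.length > limit;
}
```

A real implementation would tune the limit per platform, since legitimate burst behavior (for example, rapid scrolling) differs between web and mobile.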
Recommended bot filtering rules
Use these checks to separate real users from unwanted traffic.
| Check | What to detect | Recommended action |
|---|---|---|
| User agent | Headless browsers, crawlers, automation tools | Mark as bot or exclude |
| Event frequency | Too many events in a short time | Flag as suspicious |
| Session pattern | Very short or repeated sessions | Review or exclude |
| IP pattern | Repeated traffic from same source | Add to filter list |
| Environment | development, staging, debug | Exclude from production reports |
| Internal users | Team, QA, admin, developer traffic | Mark as internal |
| Invalid properties | Missing user agent, device, platform, session | Flag for review |
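The checks in the table above can be combined into a single classification pass. The sketch below is an illustration under assumed field names, patterns, and thresholds; it is not a built-in Twinalyze filter:

```typescript
// Illustrative sketch: classify a session using the rules from the table.
// Field names, the user-agent pattern, and the threshold are assumptions.

type SessionInfo = {
  userAgent: string;
  eventsLastMinute: number;
  environment: "production" | "staging" | "development" | "debug";
  isInternalUser: boolean;
};

type Verdict = "excluded" | "internal" | "bot" | "suspicious" | "ok";

const BOT_AGENT_PATTERN = /bot|crawler|spider|headless|phantomjs/i;
const MAX_EVENTS_PER_MINUTE = 300; // assumed threshold, tune per product

function classify(session: SessionInfo): Verdict {
  // Environment check: keep non-production traffic out of reports.
  if (session.environment !== "production") return "excluded";
  // Internal users: mark team, QA, and developer traffic.
  if (session.isInternalUser) return "internal";
  // User agent check: headless browsers, crawlers, automation tools.
  if (BOT_AGENT_PATTERN.test(session.userAgent)) return "bot";
  // Event frequency check: too many events in a short time.
  if (session.eventsLastMinute > MAX_EVENTS_PER_MINUTE) return "suspicious";
  return "ok";
}
```

Ordering matters here: environment and internal-user rules run first so that, for example, a QA engineer using an automation tool is counted as internal traffic rather than as a bot.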