Editorial Policy & The Loadsyn Testing Methodology
Our reputation is built on the transparency and rigor of our data. This policy details our commitment to editorial independence and the specific, standardized process by which we collect, verify, and publish all technical data.
Part I: Loadsyn Editorial Independence
Our commitment to verifiable truth is non-negotiable. This section outlines the principles that govern our reporting, ensuring our analysis remains impartial and focused solely on the data.
- No Payment for Placement: We do not accept payment, compensation, or any form of consideration in exchange for a positive review score, chart placement, or product recommendation.
- Separation of Lab and Sales: Our testing specialists (like Samantha Hayes or Marcus Coleman) operate independently of any affiliate or advertising teams. Their scores and data analysis are final and non-negotiable.
- Loaner Policy: All hardware loaned to Loadsyn for review is returned or purchased after the review embargo expires. The publication of data is never contingent on the manufacturer’s approval or on any commercial agreement.
- Corrections and Retractions: We maintain an open policy for correcting factual errors immediately. If an error is found in our published data (for example, a software bug that skews a 1% Low reading), we will issue a visible correction, re-test, and update the chart.
Part II: The Standardized Testing Methodology
This is the core of our authority. Our data is not based on anecdotal feeling; it is the product of standardized scripts, controlled environments, and specialized logging hardware. All data is collected and signed by the specialist responsible for that metric.
1. Performance Analysis (GPU & CPU)
Authored and signed by the Performance Analysis & Benchmarks team.
- Focus on Consistency: We use custom logging software to capture frame-time consistency across extended 30-minute runs, not just a 5-minute average. This is how we isolate and report true 1% and 0.1% Lows (see the sketch after this list).
- The Golden Rig Standard: All GPU and CPU testing is conducted on a single, standardized test platform, known as ‘The Golden Rig,’ with a pared-down OS and background services tuned for minimal variance, ensuring results remain comparable across years.
- Power Efficiency: We record instantaneous power draw and temperatures via dedicated logging tools, connecting performance directly to thermal and power-efficiency metrics.
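For readers who want to follow the arithmetic, the sketch below shows one common convention for deriving average FPS and 1% / 0.1% Lows from a frame-time log. The file name, CSV layout, and the exact percentile convention are illustrative assumptions, not a description of our internal tooling.

```python
"""Minimal sketch: average FPS and 1% / 0.1% Lows from a frame-time log.
The file name, CSV layout, and percentile convention are assumptions."""

import csv
import statistics


def load_frame_times_ms(path: str) -> list[float]:
    # Assumes a one-column CSV of per-frame render times in milliseconds.
    with open(path, newline="") as f:
        return [float(row[0]) for row in csv.reader(f) if row]


def percentile_low_fps(frame_times_ms: list[float], fraction: float) -> float:
    # "1% Low" here = average FPS over the slowest `fraction` of frames
    # (one common convention; percentile-of-frame-time is another).
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, int(len(worst) * fraction))
    return 1000.0 / statistics.mean(worst[:n])


if __name__ == "__main__":
    times = load_frame_times_ms("run_30min.csv")  # hypothetical log file
    avg_fps = 1000.0 / statistics.mean(times)
    print(f"Average FPS : {avg_fps:.1f}")
    print(f"1% Low      : {percentile_low_fps(times, 0.01):.1f}")
    print(f"0.1% Low    : {percentile_low_fps(times, 0.001):.1f}")
```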
2. Low-Level Latency and Input Metrics
Authored and signed by the Input Latency Devices and Display Tech specialists.
- Hardware-Based Verification: We use the Open-Source Response Time Tool (OSRTT) and high-speed cameras to measure total system input lag from click to photon, bypassing software-based API reporting that can be misleading (a sketch of how such readings can be summarized follows this list).
- Motion Clarity: Display reviews use standardized tests for MPRT (Moving Picture Response Time) and GtG (Gray-to-Gray) to quantify motion blur and ghosting, moving beyond the manufacturer’s advertised refresh rates.
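As a rough illustration of why we report a distribution rather than a single number, the sketch below summarizes a set of repeated click-to-photon trials. The sample values and trial count are hypothetical placeholders, not OSRTT output.

```python
"""Minimal sketch: summarising repeated click-to-photon measurements.
The sample values are placeholders, not real OSRTT readings."""

import statistics


def summarise_latency(samples_ms: list[float]) -> dict[str, float]:
    # A single trial can mislead, so we report the distribution:
    # median for the typical case, 95th percentile for the worst spikes.
    ordered = sorted(samples_ms)
    p95_index = min(len(ordered) - 1, int(round(0.95 * (len(ordered) - 1))))
    return {
        "mean_ms": statistics.mean(ordered),
        "median_ms": statistics.median(ordered),
        "p95_ms": ordered[p95_index],
        "stdev_ms": statistics.stdev(ordered) if len(ordered) > 1 else 0.0,
    }


if __name__ == "__main__":
    # Hypothetical click-to-photon readings (ms) from 10 trials.
    trials = [21.4, 19.8, 22.1, 20.5, 24.9, 19.2, 21.0, 23.3, 20.1, 22.7]
    for key, value in summarise_latency(trials).items():
        print(f"{key:>10}: {value:.2f}")
```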
3. Optimization and AI Fidelity
Authored and signed by the Optimization Science & AI Tech team.
- Pixel-Level Artifacting: We conduct direct, pixel-to-pixel comparisons of DLSS, FSR, and XeSS outputs to isolate, categorize, and quantify visual artifacts such as shimmering and ghosting; that analysis informs our final visual fidelity score (a simplified comparison sketch follows this list).
- Code and Driver Analysis: Our optimization guides are based on analysis of driver overhead and low-level system settings, and are rooted in verifiable system behavior, not guesswork.
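To illustrate the basic idea behind a pixel-level comparison, the sketch below computes simple per-pixel error metrics between a native-resolution frame and an upscaler’s output. The file names are placeholders, and reducing the comparison to a single error metric is a deliberate simplification of the full artifact analysis.

```python
"""Minimal sketch: per-pixel comparison of a native frame vs. an
upscaler's output (DLSS/FSR/XeSS). File names are placeholders."""

import numpy as np
from PIL import Image


def load_frame(path: str) -> np.ndarray:
    # Load as RGB and normalise to [0, 1] floats for the comparison.
    return np.asarray(Image.open(path).convert("RGB"), dtype=np.float64) / 255.0


def compare_frames(native: np.ndarray, upscaled: np.ndarray) -> dict[str, float]:
    # Per-pixel absolute error plus PSNR; large localised errors often
    # correspond to shimmering or ghosting regions worth inspecting by eye.
    diff = np.abs(native - upscaled)
    mse = float(np.mean(diff ** 2))
    psnr = float("inf") if mse == 0 else 10.0 * np.log10(1.0 / mse)
    return {
        "mean_abs_error": float(diff.mean()),
        "max_abs_error": float(diff.max()),
        "psnr_db": psnr,
    }


if __name__ == "__main__":
    native = load_frame("native_4k.png")      # hypothetical reference frame
    upscaled = load_frame("upscaled_4k.png")  # hypothetical upscaler output
    print(compare_frames(native, upscaled))
```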
4. Engineering Deconstruction
Authored and signed by the Hardware Engineering Deconstructed team.
- Physical Analysis: Technical articles often begin with a complete component teardown to visually inspect PCB layouts, VRM component count, and thermal solution design.
- Die-Shot Interpretation: We publish and interpret die-shots to explain the physical micro-architecture behind a processor, including core layout and cache topology, and how that directly translates to in-game performance.