PCI DSS Scoping Engine
Not every PCI DSS requirement applies to every merchant. The Scoping Engine automatically adjusts the assessment to reflect the merchant’s actual environment by hiding requirements that are not relevant and marking the corresponding fields as Not Applicable.

How Scoping Works
The scoping engine evaluates a set of scoping rules against the assessor’s answers to scoping questions. Each rule consists of:

- A condition — a field path, an operator, and an expected value.
- An action — what to do when the condition is met (`hide_requirement`, `show_requirement`, or `set_na`).
Supported Condition Operators
| Operator | Behavior |
|---|---|
| `equals` | Field value exactly matches the expected value |
| `not_equals` | Field value does not match the expected value |
| `contains` | Field value (string) contains the expected substring |
| `not_contains` | Field value does not contain the substring |
| `exists` | Field has a non-empty value |
| `not_exists` | Field is empty, null, or undefined |
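The operator table above can be sketched as a small evaluator. This is an illustrative model, not Kliper's actual implementation — the `Condition` shape, field names, and the flat answer map are assumptions based on the rule description earlier in this section.

```typescript
// Hypothetical sketch of scoping-condition evaluation.
// Shapes and names are illustrative, not the platform's API.

type Operator =
  | "equals" | "not_equals"
  | "contains" | "not_contains"
  | "exists" | "not_exists";

interface Condition {
  field: string;    // field path into the scoping answers
  operator: Operator;
  value?: string;   // expected value (unused for exists/not_exists)
}

function evaluateCondition(
  cond: Condition,
  answers: Record<string, string | undefined>
): boolean {
  const actual = answers[cond.field];
  switch (cond.operator) {
    case "equals":       return actual === cond.value;
    case "not_equals":   return actual !== cond.value;
    case "contains":     return typeof actual === "string" && actual.includes(cond.value ?? "");
    case "not_contains": return !(typeof actual === "string" && actual.includes(cond.value ?? ""));
    case "exists":       return actual !== undefined && actual !== "";
    case "not_exists":   return actual === undefined || actual === "";
  }
  return false; // unreachable for a valid operator
}
```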
Built-In Scoping Rules
Kliper ships with scoping rules derived from the official PCI DSS v4.0.1 ROC template. These rules cover the most common scoping scenarios:
Wireless Technology
Scoping question: Does the entity use wireless technologies?

When answered No, the following requirements are automatically hidden:
| Hidden Requirement | Description |
|---|---|
| 1.2.3 | Wireless access points configuration |
| 2.1.1 | Wireless vendor defaults changed |
| 4.1.1 | Wireless transmission encryption |
| 11.1 | Wireless access point testing |
| 11.2.1 | Wireless scanning processes |
| 11.2.2 | Wireless IDS/IPS deployment |

Additional sub-rules for wireless scanning method (11.1.c) and automated monitoring (11.1.d) are evaluated independently based on whether those specific techniques are in use.
End-User Messaging
Scoping question: Does the entity transmit cardholder data via end-user messaging technologies?

When answered No, Requirement 4.2.2 (securing end-user messaging technologies) is hidden.
Service Provider Status
Scoping question: Is the assessed entity a service provider?

When answered No, Requirement 12.9 (service provider acknowledgment of responsibilities) is hidden.
P2PE Solution
Scoping question: Does the entity use a PCI-listed P2PE solution?

When answered Yes, Requirement 3.4 (PAN rendering unreadable) is hidden — the P2PE solution addresses this requirement.
Cardholder Data Storage
Scoping question: Does the entity store cardholder data?

When answered No, Requirement 3.1 (cardholder data retention policies) is hidden.
Network Segmentation
Scoping question: Does the entity use network segmentation to reduce PCI DSS scope?

When answered No, Requirement 6.1 (segmentation testing) is hidden.
Scoping Evaluation Flow
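Putting the pieces together, the flow is: for each rule, test its condition against the current scoping answers, then apply the rule's action to the requirement's visibility. The sketch below is a hedged illustration of that flow — the `Rule` shape, equality-only matching, and the result sets are assumptions, not the engine's real data model.

```typescript
// Illustrative sketch of the scoping evaluation flow:
// evaluate each rule and apply hide/show/set_na actions.

type Action = "hide_requirement" | "show_requirement" | "set_na";

interface Rule {
  field: string;        // scoping question field path
  expected: string;     // value that triggers the action
  action: Action;
  requirement: string;  // e.g. "11.1"
}

interface ScopingResult {
  hidden: Set<string>;
  notApplicable: Set<string>;
}

function applyScoping(rules: Rule[], answers: Record<string, string>): ScopingResult {
  const hidden = new Set<string>();
  const notApplicable = new Set<string>();
  for (const rule of rules) {
    if (answers[rule.field] !== rule.expected) continue; // condition not met
    switch (rule.action) {
      case "hide_requirement": hidden.add(rule.requirement); break;
      case "show_requirement": hidden.delete(rule.requirement); break;
      case "set_na": notApplicable.add(rule.requirement); break;
    }
  }
  return { hidden, notApplicable };
}
```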
Re-Scoping
Scoping is not permanent. If the assessor changes a scoping answer (e.g., updates “Uses wireless?” from No to Yes), the engine re-evaluates all rules immediately. Previously hidden requirements reappear in the workbench, and their N/A markers are cleared. No assessment data is lost during re-scoping — answers that were previously entered for a now-hidden requirement are preserved and restored if the requirement becomes visible again.

Assessment Workbench
The Assessment Workbench is the primary interface where assessors conduct their evaluation. It is designed for extended, focused work sessions on individual requirements.

Layout
The workbench uses a three-panel layout:

| Panel | Position | Purpose |
|---|---|---|
| Section Tree | Left | Hierarchical navigation of all PCI DSS sections and sub-requirements. Shows completion status per section. |
| Question Panel | Center | The active requirement’s testing procedures, reporting instructions, answer fields, and finding selection. |
| Context Panels | Right (collapsible) | Stacked, collapsible panels for Cortex AI, Attachments, Comments, Collaborators, Gap Assessment, and Audit Trail. |
View Modes
The workbench supports two viewing modes, switchable from the top bar:
Sections Mode (default)
The standard three-panel layout with the Section Tree visible on the left. Use this when you want to see your progress across the full assessment while working on a single requirement.
Focus Mode
Full-width single-subsection view with no sidebar. Designed for distraction-free work on one requirement at a time.
- Focus nav bar — a thin bar at the top shows the current section heading and subsection label, with Prev and Next buttons for linear navigation through the assessment
- Exit Focus button returns to Sections Mode
- The right-side context panels (Cortex, Attachments, Comments) remain available
Live Presence (Live Status)
When multiple assessors are working on the same assessment, Live Status in the top bar shows who else is present. Each active user appears as an avatar with their name on hover, updated in real time via the platform’s presence system. This makes multi-assessor engagements visible without manual check-ins.

Compact Section Rows
When a finding status has been set on a requirement, its row in the section tree collapses to a single status chip showing the current finding (In Place, Not in Place, Not Applicable, or Not Tested). The full set of options appears only when no finding has been set, reducing visual clutter on assessments where most requirements are complete.

Section Tree (Left Panel)
The section tree displays all 12 PCI DSS principal requirements and their sub-sections in a collapsible hierarchy. Each node shows:

- Requirement number (e.g., 3.4.1)
- Completion indicator — visual status showing whether the requirement has been answered
- Scoping visibility — requirements hidden by the scoping engine do not appear in the tree
Question Panel (Center)
For each requirement, the center panel presents:

Testing Procedures
Each testing procedure defined in the ROC template for this requirement. Testing procedures specify what the assessor must examine, interview, or observe. Each procedure has a structured response field.
Reporting Instructions
The ROC template’s reporting instructions — structured guidance on what the assessor must document. These instructions describe which documents to review, which personnel to interview, which configurations to inspect, and what to report.
Validation Steps (Structured Prefix)
Pickable list fields for documenting:
- Documentation Reviewed — link to uploaded evidence files
- Samples Taken — sampling methodology and selections
- Personnel Interviewed — names and roles
- Assessor — lead QSA or associate
- Critical Technologies — systems and components examined
- Settings Reviewed — configuration parameters inspected
- Methods — testing procedures and approaches used
- Software — PCI SSC validated products or other applications
Assessment Finding
A selection for the requirement’s finding status:
- In Place — requirement is fully met
- Not Applicable — requirement does not apply to the assessed environment
- Not Tested — requirement was not evaluated
- Not in Place — requirement is not met
- Compensating Control — Appendix C applies
- Customized Approach — Appendix E applies
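The finding options above map naturally to a small type. The sketch below is illustrative — the union type and the appendix lookup simply mirror the list's descriptions (Appendix C for compensating controls, Appendix E for the customized approach) and are not the platform's actual data model.

```typescript
// Illustrative model of the assessment finding options.
type Finding =
  | "In Place"
  | "Not Applicable"
  | "Not Tested"
  | "Not in Place"
  | "Compensating Control"
  | "Customized Approach";

// Findings that pull in an additional appendix worksheet in the ROC,
// per the descriptions above.
function requiredAppendix(f: Finding): "C" | "E" | null {
  if (f === "Compensating Control") return "C";
  if (f === "Customized Approach") return "E";
  return null;
}
```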
Context Panels (Right Side)
The right side of the workbench contains collapsible panels that provide contextual information without leaving the current requirement:
Cortex AI Panel
A chat interface for interacting with Cortex. The assessor can ask questions about the current requirement, request PCI DSS guidance, or trigger auto-fill for the findings description. Cortex responses are contextualized to the specific requirement being worked on. See the Cortex AI guide for details.
Attachments Panel
Lists all evidence files uploaded for the current assessment, optionally filtered by section. Each file shows:
- File name and type
- Upload date and uploader
- Malware scan status (clean, pending, quarantined) with per-engine results
- AI validation status (Pending, Complete, Partial)
- Tags (requirement associations, document tags)
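The per-file fields above amount to a small record type. The interface below is a possible shape for one attachment row — all field names are illustrative, inferred from the list, not Kliper's schema.

```typescript
// A possible shape for an attachment row as listed in the panel above.
// Field names are assumptions based on the displayed metadata.
type ScanStatus = "clean" | "pending" | "quarantined";
type AiValidation = "Pending" | "Complete" | "Partial";

interface AttachmentRow {
  fileName: string;
  fileType: string;
  uploadedAt: string;                          // ISO date of upload
  uploadedBy: string;
  scanStatus: ScanStatus;                      // overall malware scan result
  perEngineResults: Record<string, ScanStatus>; // result per scan engine
  aiValidation: AiValidation;
  tags: string[];                              // requirement associations, document tags
}
```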
Comments Panel
Threaded, requirement-scoped comments. Assessors can:
- Post comments on specific requirements
- @mention team members (triggers notifications)
- Mark comment threads as resolved
- View comment history and timestamps
Collaborators Panel
Shows team members assigned to the assessment and their roles (Editor, Reviewer, Viewer). Displays live presence — which team members are currently viewing the assessment.
Audit Trail Panel
A chronological log of every change made to the current requirement — who changed what, when, and the before/after values. Useful for QA review and responding to PCI Council inquiries.
Answer Status Progression
Each assessment answer progresses through a defined status lifecycle:

- Pending — initial state. The assessor is still working on the requirement.
- Reviewed — the answer has been reviewed by a peer or supervisor.
- Approved — the answer is finalized and locked for inclusion in the ROC report.
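The lifecycle above is strictly linear, which can be sketched as a one-step transition table. How the platform actually enforces this ordering is not described here, so the guard below is an assumption for illustration.

```typescript
// Sketch of the Pending -> Reviewed -> Approved lifecycle.
// The transition guard is an assumed enforcement, not the platform's code.
type AnswerStatus = "Pending" | "Reviewed" | "Approved";

const NEXT: Record<AnswerStatus, AnswerStatus | null> = {
  Pending: "Reviewed",
  Reviewed: "Approved",
  Approved: null, // terminal: locked for inclusion in the ROC report
};

function advance(status: AnswerStatus): AnswerStatus {
  const next = NEXT[status];
  if (next === null) throw new Error("Approved answers are locked");
  return next;
}
```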
Document Evidence Sync
When files are uploaded to an assessment, Kliper automatically syncs them into the Section 6.4 Documentation Evidence table. This happens transparently:

- File is uploaded with optional tags (e.g., `doctag-DOCFW` for a firewall documentation tag).
- The platform creates or updates a row in the 6.4 answer’s `docEvidence` array.
- Each row contains: file ID, document reference tag, file name, AI-generated purpose summary, and upload date.
- Manual rows (entered directly by the assessor) are preserved alongside auto-generated rows.
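The upsert step described above can be sketched as follows. This is a hedged illustration of the "create or update a row, preserve manual rows" behavior — the row field names (`fileId`, `docTag`, etc.) are assumptions based on the list, not the actual `docEvidence` schema.

```typescript
// Hedged sketch of the 6.4 docEvidence sync: upsert an auto-generated row
// keyed by file ID while leaving manually entered rows untouched.

interface DocEvidenceRow {
  fileId: string | null; // null for manual rows entered by the assessor
  docTag: string;        // document reference tag, e.g. from an upload tag
  fileName: string;
  purpose: string;       // AI-generated purpose summary
  uploadedAt: string;
}

function syncDocEvidence(rows: DocEvidenceRow[], upload: DocEvidenceRow): DocEvidenceRow[] {
  const idx = rows.findIndex(r => r.fileId !== null && r.fileId === upload.fileId);
  if (idx >= 0) {
    // Update the existing auto-generated row for this file.
    const next = rows.slice();
    next[idx] = upload;
    return next;
  }
  // Manual rows (fileId === null) are never matched, so they are preserved;
  // the new auto-generated row is appended.
  return [...rows, upload];
}
```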
Resume Where You Left Off
The dashboard tracks the last subsection you worked on per assessment and offers a one-click resume.

How It Works
- Per-assessment tracking — when you open a specific requirement in the workbench, the platform stores the assessment ID, subsection ID, and a human-readable label in your browser’s localStorage.
- Dashboard Resume Card — the next time you visit the dashboard, a Resume Card appears with the assessment name, the specific requirement you last worked on, and a Continue button.
- Strict matching — the card only appears when localStorage has a confirmed match for an assessment you still have access to. There is no arbitrary “in progress” fallback — the card either shows the exact location you left, or it doesn’t show at all.
- Per-device — because tracking uses localStorage, the resume position is specific to the browser you were working in. Switching devices starts fresh.
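The mechanics above can be sketched with a minimal save/load pair. A Storage-like key/value interface stands in for the browser's localStorage so the sketch stays self-contained; the key name and payload shape are assumptions, not the platform's actual storage format.

```typescript
// Minimal sketch of the resume mechanics: store the last position,
// and only surface it when it still matches an accessible assessment.

interface ResumePosition {
  assessmentId: string;
  subsectionId: string;
  label: string; // human-readable, e.g. "Req 3.4.1"
}

const RESUME_KEY = "kliper.resume"; // hypothetical storage key

// Stand-in for window.localStorage, so the sketch runs anywhere.
type KV = { getItem(k: string): string | null; setItem(k: string, v: string): void };

function saveResume(store: KV, pos: ResumePosition): void {
  store.setItem(RESUME_KEY, JSON.stringify(pos));
}

// Strict matching: return a position only when it points at an assessment
// the user still has access to; otherwise show nothing.
function loadResume(store: KV, accessible: Set<string>): ResumePosition | null {
  const raw = store.getItem(RESUME_KEY);
  if (raw === null) return null;
  const pos = JSON.parse(raw) as ResumePosition;
  return accessible.has(pos.assessmentId) ? pos : null;
}
```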
Why It Matters
On long engagements, assessors often work in short sessions spread across days. Resume eliminates the “which requirement was I on?” moment at the start of each session — the dashboard remembers for you.

Compact Assessment Hub Cards
The stat cards at the top of the Assessment Hub (Total Requirements, Completed, In Progress, Not Started, etc.) use a compact layout:

- Reduced padding, icon size, and font size
- Tighter progress bars
- More cards visible without scrolling