Methodology
The boring but important part. How we verify everything you read on this site.
Every program in the affiliatejob directory is manually verified every 30 to 60 days. Verification combines the program's official affiliate page, the network's public terms, community-submitted payout reports, and our own monthly checks. Reliability scores are calculated from the trailing 12 months of data, with sample sizes shown. Featured listings are tagged as paid placements and do not influence editorial rankings or reliability scores.
Why methodology matters
Affiliate marketing has a credibility problem. Most directories list programs based on whatever the program told them once, then never re-check. Most reviews are written by affiliates who haven't actually run the program. Most "rankings" are bought and sold without disclosure. The result is an industry where serious operators don't trust public data and rely on private Telegram groups and Slack channels instead.
This site exists to be different. The methodology is what makes the difference real instead of marketing copy. If we say a program pays 30 percent recurring with a 60 day cookie, we want that statement to be defensible against any program owner, journalist, or skeptic who reads it. Below is exactly how we get there.
The five step verification process
Step 1. Read the official affiliate page
Pull up the program's actual affiliate page directly from the company's website (not a third-party review). Note the headline commission, recurring versus one-time, cookie length, payout method, payout schedule, minimum threshold, two-tier availability, and tracking platform. Screenshot saved. Date stamped.
Step 2. Read the network terms
If the program runs on Impact, PartnerStack, ShareASale, CJ Affiliate, Awin, Rakuten, ClickBank, MaxBounty, or another tracked network, pull up the public network terms page and the program's specific listing on that network. Verify the headline commission matches what the network says (marketing pages sometimes overstate it). Note any clauses about brand bidding, allowed traffic types, and refund clawback windows.
Step 3. Cross check community payout reports
Search AffiliateFix forum, AffLIFT community, /r/Affiliatemarketing, and a handful of private affiliate Slacks for "[program name] payout" or "[program name] paid" in the trailing 12 months. We're looking for: actual payout screenshots, complaints about delays or missed payments, mentions of commission scrubbing or unusual term changes, and discussions of the program's responsiveness on disputes.
Step 4. Update the JSON file
Each program lives in a single JSON file in our content repository. We update commission rate, cookie days, payout method, last verified date, and reliability score. Git keeps the full history so you can see exactly what changed and when.
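The field names below are illustrative, not the actual schema; a minimal sketch of what one program file might contain:

```json
{
  "slug": "example-program",
  "commission_rate": "30% recurring",
  "cookie_days": 60,
  "payout_method": "PayPal, wire",
  "payout_schedule": "net-30",
  "minimum_threshold_usd": 50,
  "two_tier": false,
  "network": "PartnerStack",
  "last_verified": "2024-05-12",
  "reliability_grade": "A-",
  "reliability_n": 23
}
```

Because each program is one file, a single git diff shows every field that changed on a given verification pass.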
Step 5. Rebuild the static page
The build script regenerates the program's HTML page with the updated data and a new "last verified" timestamp. Schema markup updates automatically (Product, Review, FAQPage). The sitemap updates. The page deploys to the Hetzner VPS and Cloudflare CDN. Total time from data change to live page: typically under 10 minutes.
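A minimal sketch of that rebuild step. The paths, template, and field names here are assumptions for illustration, not the actual build script, which also emits schema markup and regenerates the sitemap:

```python
import json
from datetime import date
from pathlib import Path
from string import Template

# Hypothetical page template; a stand-in for the real one.
PAGE = Template(
    "<h1>$name</h1>"
    "<p>Commission: $commission, Cookie: $cookie_days days</p>"
    "<p>Last verified: $last_verified</p>"
)

def rebuild(json_path: Path, out_dir: Path) -> Path:
    """Render one program's JSON file into a static HTML page."""
    data = json.loads(json_path.read_text())
    # Stamp the page with the date of this verification pass.
    data["last_verified"] = date.today().isoformat()
    html = PAGE.substitute(data)
    out = out_dir / f"{data['slug']}.html"
    out.write_text(html)
    return out
```

The point of the design is that the page is a pure function of the JSON file: change the data, rerun the build, and the timestamp and markup update together.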
How reliability scores are calculated
Network reliability scores (A through F) and program reliability scores (also A through F) come from the same underlying data, weighted slightly differently for each.
The data sources
- Community payout receipts submitted via our anonymous form. We accept screenshots with affiliate IDs and dollar amounts redacted, as long as dates are visible. Required: program name or network name, payment date, payment method, and the scheduled date if the payment was delayed.
- Public network history: payout schedule changes, program closures, term changes announced by the network. Tracked monthly via the network's announcement page or status page.
- Our own monthly checks: we run small affiliate accounts on the major networks specifically to verify payment timing first hand. The sample size is small, but it corroborates what the community reports say.
- Public dispute records where networks publish them. Impact, PartnerStack, and ShareASale all have some level of public dispute resolution data we can reference.
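The core number these sources feed is an on-time percentage. As a sketch of the roll-up (the field names are assumptions, not the site's actual pipeline):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class PayoutReport:
    # One payout data point; fields assumed for illustration.
    scheduled: date  # date the payment was due
    paid: date       # date the payment actually arrived

def on_time_rate(reports: list[PayoutReport]) -> float:
    """Fraction of payments that arrived on or before the scheduled date."""
    if not reports:
        raise ValueError("no reports in the trailing 12 months")
    on_time = sum(1 for r in reports if r.paid <= r.scheduled)
    return on_time / len(reports)

reports = [
    PayoutReport(date(2024, 3, 1), date(2024, 3, 1)),  # on time
    PayoutReport(date(2024, 4, 1), date(2024, 4, 4)),  # 3 days late
    PayoutReport(date(2024, 5, 1), date(2024, 5, 1)),  # on time
    PayoutReport(date(2024, 6, 1), date(2024, 6, 1)),  # on time
]
print(round(on_time_rate(reports), 2))  # 0.75
```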
The grading scale
| Grade | Meaning | Threshold |
|---|---|---|
| A | Excellent reliability, no significant issues | 95%+ on time payments, zero unreported delays in trailing 12 months, n >= 20 |
| A- | Strong reliability, minor issues | 90%+ on time, occasional 1 to 3 day delays, n >= 15 |
| B+ | Good with isolated incidents | 85%+ on time, 1 to 2 reported issues handled within 30 days |
| B | Acceptable, watch closely | 80%+ on time, multiple isolated incidents but no systemic issues |
| B- | Trending down or recovering | 75%+ on time, recent improvement or recent decline |
| C+ to C- | Concerning, recommend caution | 60 to 75% on time, multiple unresolved disputes, slow CS response |
| D | Multiple unresolved payment issues | Below 60% on time, pattern of delayed or partial payments |
| F | Do not recommend | Multiple confirmed cases of non payment, network closing, or affiliate terms violations against affiliates |
Sample sizes are shown next to every score. An A grade with n=4 is not the same as an A grade with n=87. The directory shows both so you can judge confidence.
Scores update quarterly unless something major happens (a network announces a payment delay, a program closes, multiple new payout reports change the picture). Major changes trigger same-week updates.
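Reading the threshold column as a first-pass filter, the scale could be sketched like this. The real scoring also weighs dispute handling and incident history, which this sketch ignores, and it collapses the C+ to C- band into a single C:

```python
def provisional_grade(on_time_pct: float, n: int) -> str:
    """First-pass grade from trailing-12-month on-time percentage and sample size.

    Checks only the numeric thresholds from the grading table; incident
    history, dispute handling, and F-grade judgments are manual.
    """
    if on_time_pct >= 95 and n >= 20:
        return "A"
    if on_time_pct >= 90 and n >= 15:
        return "A-"
    if on_time_pct >= 85:
        return "B+"
    if on_time_pct >= 80:
        return "B"
    if on_time_pct >= 75:
        return "B-"
    if on_time_pct >= 60:
        return "C"  # C+ to C- band collapsed here
    return "D"      # F requires confirmed non-payment, judged manually

print(provisional_grade(96.0, 45))  # A
print(provisional_grade(92.0, 18))  # A-
print(provisional_grade(70.0, 12))  # C
```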
How second-tier availability is verified
Two-tier and sub-affiliate programs are worth verifying with particular care because the term gets misused. Some programs market a "refer a friend" credit as second tier (it isn't). Some are technically MLM with multiple levels. Some have the structure but rarely actually pay tier-2 commissions.
Our criteria for listing a program as having a real two-tier component:
- Public affiliate terms explicitly describe a two-tier or sub-affiliate structure with a stated percentage
- The structure is exactly two levels (tier 1 plus tier 2). Three or more tiers is MLM, not affiliate marketing, and we don't list those
- Tier-2 commissions are paid as a percentage of the referred affiliate's actual product commissions (not a flat bounty for signing them up)
- The relationship is tracked automatically by the program's platform, with no manual reconciliation required
- At least one community member has confirmed receiving a tier-2 payment within the last 12 months (or the program is recent enough that we can verify the tracking is live)
Programs that fail any of these criteria don't get the two-tier tag, even if they market themselves as having one.
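The checklist is mechanical enough to express as a predicate. The field names below are assumptions for illustration, not the directory's actual data model:

```python
from dataclasses import dataclass

@dataclass
class TierStructure:
    # Hypothetical fields describing a program's referral structure.
    documented_in_terms: bool      # public terms state a tier-2 percentage
    levels: int                    # depth of the referral structure
    pays_pct_of_commissions: bool  # tier 2 = % of referred affiliate's commissions
    tracked_automatically: bool    # no manual reconciliation needed
    payment_confirmed: bool        # tier-2 payment confirmed in last 12 months,
                                   # or tracking verified live for new programs

def qualifies_as_two_tier(p: TierStructure) -> bool:
    """All five criteria must hold; failing any one drops the tag."""
    return (
        p.documented_in_terms
        and p.levels == 2  # 3+ levels is MLM, not affiliate
        and p.pays_pct_of_commissions
        and p.tracked_automatically
        and p.payment_confirmed
    )
```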
How we handle program submissions
The submission flow at /submit is described there. Here's the editorial process after submission.
Initial review (within 48 hours)
The submitter's data is checked against the program's actual affiliate page. If the program owner submitted it themselves, we still verify the data independently. Press release language is rewritten in our editorial voice. Marketing claims are stripped if not verifiable.
Approval or rejection
Approved programs go to the writeup phase. Rejected programs get a brief written reason. Common rejection reasons: program is closed, terms are unverifiable, network has F grade reliability, category is excluded (adult, unregulated crypto, predatory lending), or the submitter is clearly trying to game the directory.
Writeup (within 24 hours of approval)
An editor writes the 2,000-word review, including specs, pros, cons, who should and shouldn't promote it, a program-specific FAQ, and competitive positioning against similar programs. The writeup is editorial. The program owner does not see it before publication. We don't run press releases.
Live deployment
Page goes live at /programs/[slug]. Schema markup attaches. Sitemap updates. The program owner gets an email with the URL and the schema validator results. They can request corrections within 7 days; substantive editorial points are not negotiable, but factual errors are fixed.
How we handle errors
We make mistakes. Programs change terms without announcing them. Sometimes our community sources are wrong. Sometimes we just type the wrong number into a JSON file.
The error-correction process: email [email protected] with the URL and the specific issue. Documentation helps but isn't required. We respond within one business day to acknowledge the issue. Verified errors are fixed within 48 hours, and the page shows a new "last verified" date. Major errors (a commission rate wrong by a factor of two, a program listed as paying when it has stopped, etc.) get same-day fixes when possible.
We don't quietly fix errors and hope nobody notices. The git history of the content repository will eventually be public. The fix is the fix; transparency is part of the methodology.
What we deliberately don't do
- We don't auto-fetch program pages with scrapers. Manual review catches things scrapers miss (footnotes, restrictions, network changes).
- We don't accept program-owner-submitted reviews verbatim. Press release language doesn't appear in editorial copy.
- We don't backdate verifications. The "last verified" date is the actual date a human re-checked the program.
- We don't list programs we can't verify. If the affiliate page is gated behind a signup that we can't access, the program doesn't get listed until we can verify the data publicly.
- We don't accept payment to delete negative reviews. A featured listing buys placement, not editorial spin.
- We don't share community submitted payout data with anyone. Reports are aggregated into reliability scores. Individual reports stay confidential.