Methodology

The boring but important part. How we verify everything you read on this site.

Every program in the affiliatejob directory is verified manually every 30 to 60 days. Verification combines the program's official affiliate page, the network's public terms, community-submitted payout reports, and our own monthly checks. Reliability scores are calculated from the trailing 12 months of data, with sample sizes shown. Featured listings are tagged as paid placement and do not influence editorial rankings or reliability scores.

Why methodology matters

Affiliate marketing has a credibility problem. Most directories list programs based on whatever the program told them once, then never re-check. Most reviews are written by affiliates who haven't actually run the program. Most "rankings" are bought and sold without disclosure. The result is an industry where serious operators don't trust public data and rely on private Telegram groups and Slack channels instead.

This site exists to be different. The methodology is what makes the difference real instead of marketing copy. If we say a program pays 30 percent recurring with a 60 day cookie, we want that statement to be defensible against any program owner, journalist, or skeptic who reads it. Below is exactly how we get there.

The five step verification process

Step 1. Read the official affiliate page

Pull up the program's actual affiliate page directly from the company's website (not a third-party review). Note the headline commission, recurring vs one-time, cookie length, payout method, payout schedule, minimum threshold, two-tier availability, and tracking platform. Screenshot saved. Date stamped.

Step 2. Read the network terms

If the program runs on Impact, PartnerStack, ShareASale, CJ Affiliate, Awin, Rakuten, ClickBank, MaxBounty, or another tracking network, pull up the public network terms page and the program's specific listing on that network. Verify the headline commission matches what the network says (sometimes marketing pages overstate). Note any clauses about brand bidding, allowed traffic types, and refund clawback windows.

Step 3. Cross check community payout reports

Search AffiliateFix forum, AffLIFT community, /r/Affiliatemarketing, and a handful of private affiliate Slacks for "[program name] payout" or "[program name] paid" in the trailing 12 months. We're looking for: actual payout screenshots, complaints about delays or missed payments, mentions of commission scrubbing or unusual term changes, and discussions of the program's responsiveness on disputes.

Step 4. Update the JSON file

Each program lives in a single JSON file in our content repository. We update commission rate, cookie days, payout method, last verified date, and reliability score. Git keeps the full history so you can see exactly what changed and when.
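A single program file might look something like this. The field names and values below are illustrative assumptions, not the repository's actual schema:

```json
{
  "name": "ExamplePay",
  "slug": "examplepay",
  "network": "PartnerStack",
  "commission": "30% recurring",
  "cookie_days": 60,
  "payout_method": "PayPal or wire",
  "payout_schedule": "monthly, net 30",
  "minimum_threshold_usd": 50,
  "two_tier": false,
  "reliability_grade": "A-",
  "reliability_n": 23,
  "last_verified": "2024-05-14"
}
```

Because each program is one plain JSON file under git, every field change shows up as a one-line diff in the history.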

Step 5. Rebuild the static page

The build script regenerates the program's HTML page with the updated data and a new "last verified" timestamp. Schema markup updates automatically (Product, Review, FAQPage). The sitemap updates. The page deploys to the Hetzner VPS and Cloudflare CDN. Total time from data change to live page: typically under 10 minutes.
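The rebuild step can be sketched like this. The paths, directory layout, and `render_program` helper are assumptions for illustration, not the site's actual build script:

```python
import datetime
import json
import pathlib


def render_program(data: dict) -> str:
    # Placeholder template. The real build would emit the full page
    # plus the Product/Review/FAQPage schema markup.
    return (f"<h1>{data['name']}</h1>\n"
            f"<p>Last verified: {data['last_verified']}</p>\n")


def rebuild(slug: str, content_dir: pathlib.Path, out_dir: pathlib.Path) -> pathlib.Path:
    # Load the program's JSON, stamp a fresh "last verified" date,
    # and write the regenerated HTML page to the output tree.
    data = json.loads((content_dir / f"{slug}.json").read_text())
    data["last_verified"] = datetime.date.today().isoformat()
    page = out_dir / slug / "index.html"
    page.parent.mkdir(parents=True, exist_ok=True)
    page.write_text(render_program(data))
    return page
```

In a full pipeline the same script would also regenerate the sitemap before the deploy step pushes the output tree to the VPS and CDN.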

How reliability scores are calculated

Network reliability scores (A through F) and program reliability scores (also A through F) come from the same underlying data, weighted slightly differently for each.

The data sources

Reliability grades draw on the same inputs as program verification: official program and network terms, community-submitted payout reports from AffiliateFix, AffLIFT, /r/Affiliatemarketing, and private affiliate Slacks, and our own monthly checks, all over the trailing 12 months.

The grading scale

| Grade | Meaning | Threshold |
|---|---|---|
| A | Excellent reliability, no significant issues | 95%+ on-time payments, zero unreported delays in trailing 12 months, n >= 20 |
| A- | Strong reliability, minor issues | 90%+ on time, occasional 1 to 3 day delays, n >= 15 |
| B+ | Good with isolated incidents | 85%+ on time, 1 to 2 reported issues handled within 30 days |
| B | Acceptable, watch closely | 80%+ on time, multiple isolated incidents but no systemic issues |
| B- | Trending down or recovering | 75%+ on time, recent improvement or recent decline |
| C+ to C- | Concerning, recommend caution | 60 to 75% on time, multiple unresolved disputes, slow CS response |
| D | Multiple unresolved payment issues | Below 60% on time, pattern of delayed or partial payments |
| F | Do not recommend | Multiple confirmed cases of non-payment, network closing, or terms violations against affiliates |

Sample sizes are shown next to every score. An A grade with n=4 is not the same as an A grade with n=87. The directory shows both so you can judge confidence.
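As a rough sketch, the grade boundaries above can be mapped in code. This simplified version keys only on the on-time payment rate, sample size, and unresolved disputes; it ignores the incident-handling and trend distinctions between B+, B, and B-, and an F requires confirmed non-payment evidence that no single metric captures:

```python
def reliability_grade(on_time_pct: float, n: int, unresolved_disputes: int = 0) -> str:
    # Thresholds mirror the grading table above; incident handling,
    # trend direction, and F-grade evidence are deliberately simplified.
    if on_time_pct >= 95 and n >= 20 and unresolved_disputes == 0:
        return "A"
    if on_time_pct >= 90 and n >= 15:
        return "A-"
    if on_time_pct >= 85:
        return "B+"
    if on_time_pct >= 80:
        return "B"
    if on_time_pct >= 75:
        return "B-"
    if on_time_pct >= 60:
        return "C"
    return "D"  # below 60% on time; F needs confirmed non-payment evidence
```

Note how the sample-size minimums bite: a 97% on-time rate with only n=10 cannot reach A or A- here, which is the point of publishing n alongside the grade.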

Scores update quarterly unless something major happens (network announces a payment delay, program closes, multiple new payout reports change the picture). Major changes trigger same week updates.

How second tier availability is verified

Two-tier and sub-affiliate programs are particularly worth verifying carefully because the term gets misused. Some programs market a "refer a friend" credit as second tier (it's not). Some are technically MLM with multiple levels. Some have the structure but rarely actually pay tier 2 commissions.

Our criteria for listing a program as having a real two tier component:

  1. Public affiliate terms explicitly describe a two tier or sub affiliate structure with a stated percentage
  2. The structure is exactly two levels (tier 1 plus tier 2). Three or more tiers is MLM, not affiliate, and we don't list those
  3. Tier 2 commissions are paid as a percentage of the referred affiliate's actual product commissions (not a flat bounty for signing them up)
  4. The relationship is tracked automatically by the program's platform without manual reconciliation required
  5. At least one community member has confirmed receiving a tier 2 payment within the last 12 months (or the program is recent enough that we can verify the tracking is live)

Programs that fail any of these criteria don't get the two-tier tag, even if they market themselves as having one.
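The five criteria amount to an all-or-nothing check. A minimal sketch, with hypothetical field names (the actual data model may differ):

```python
def qualifies_as_two_tier(program: dict) -> bool:
    # All five criteria must hold; failing any one means no two-tier tag.
    checks = [
        # 1. Terms explicitly state a tier 2 percentage
        program.get("tier2_pct_in_terms", False),
        # 2. Exactly two levels -- three or more is MLM, not listed
        program.get("tier_count") == 2,
        # 3. Tier 2 pays a % of the referred affiliate's commissions
        program.get("tier2_basis") == "pct_of_commissions",
        # 4. Tracked automatically by the platform
        program.get("tier2_tracked_automatically", False),
        # 5. A tier 2 payment confirmed in the last 12 months,
        #    or the program is new with verifiably live tracking
        program.get("tier2_payment_confirmed_12mo", False)
        or program.get("new_with_live_tracking", False),
    ]
    return all(checks)
```

The `all()` call encodes the editorial rule directly: there is no partial credit for marketing a structure that fails any single criterion.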

How we handle program submissions

The submission flow itself is described at /submit. Here's the editorial process after submission.

Initial review (within 48 hours)

The submitter's data is checked against the program's actual affiliate page. If the program owner submitted it themselves, we still verify the data independently. Press release language is rewritten in our editorial voice. Marketing claims are stripped if not verifiable.

Approval or rejection

Approved programs go to the writeup phase. Rejected programs get a brief written reason. Common rejection reasons: program is closed, terms are unverifiable, network has F grade reliability, category is excluded (adult, unregulated crypto, predatory lending), or the submitter is clearly trying to game the directory.

Writeup (within 24 hours of approval)

An editor writes the 2,000 word review including specs, pros, cons, who should and shouldn't promote it, FAQ specific to the program, and competitive positioning against similar programs. The writeup is editorial. The program owner does not see it before publication. We don't run press releases.

Live deployment

Page goes live at /programs/[slug]. Schema markup attaches. Sitemap updates. The program owner gets an email with the URL and the schema validator results. They can request corrections within 7 days; substantive editorial points are not negotiable but factual errors are fixed.

How we handle errors

We make mistakes. Programs change terms without announcing them. Sometimes our community sources are wrong. Sometimes we just type the wrong number into a JSON file.

The error correction process: email [email protected] with the URL and the specific issue. Documentation helps but isn't required. We respond within one business day acknowledging the issue. Verified errors are fixed within 48 hours and the page shows a new "last verified" date. Major errors (commission rate wrong by a factor of 2, program listed as paying when it's stopped, etc.) get same day fixes when possible.

We don't quietly fix errors hoping nobody notices. The git history of the content repository will eventually be public. The fix is the fix; transparency is part of the methodology.

What we deliberately don't do

FAQ about methodology

How often is each program re-verified?

Every 30 to 60 days. Programs in volatile categories (crypto adjacent, new SaaS launches) get checked monthly. Established programs (Impact, PartnerStack, mature SaaS) get checked every 60 days. Major events (program closure, term change announcement) trigger immediate re-verification.

What if I think a reliability score is wrong?

Email [email protected] with the program or network name and the specific score concern. Include any payout data you can share (screenshots with personal info redacted). We re-evaluate within 7 days and respond with either a score update or a written explanation of why the score stands.

Can program owners see our internal data?

No. The aggregated reliability score is public. The individual community-submitted reports that feed into it are confidential. Program owners can submit their own data via the contact form to be considered alongside community reports.

Why don't you list every affiliate program?

Quality over breadth. There are 100,000+ affiliate programs in the world. Most are noise. Our 800+ are the ones worth knowing about across our covered categories. Pure CPA offers go to OfferVault. Adult and unregulated crypto go to specialist directories. We focus on programs with verifiable terms and real product offerings.

How do you avoid bias from featured listing payments?

Editorial and commercial are separate. The editor writing the review and updating the data does not see who has paid for featured. Featured listings are tagged on every page so visitors can identify paid placements. The editor's compensation does not depend on featured listing revenue.

Why do you publish your methodology publicly?

Trust requires verifiability. If we explain how we work, you can judge whether the work is legit. Hidden methodology is a red flag in any directory or review site. Ours is public and auditable.