Key Takeaways
- The best Meta attribution setting for most advertisers is 7-day click, 1-day view, but lead gen campaigns should start with 1-day click for cleaner data.
- Meta removed 7-day view and 28-day view windows on January 12, 2026, and now requires an actual link click for click-through attribution as of March 3, 2026.
- Meta typically reports 20-50% more conversions than GA4 for the same campaigns due to differences in attribution models and modeled conversions.
- If view-through conversions exceed 25% of your total, test switching to click-only attribution to see if your numbers hold.
- Incrementality testing (not just attribution) is the only way to know if your ads actually caused conversions or just took credit for them.
Table of Contents
- Best Meta Attribution Setting for Most Advertisers
- What Changed in January and March 2026
- Attribution Window Comparison
- Attribution Settings by Vertical
- How to A/B Test Attribution Windows
- The View-Through Problem: When to Trust It and When to Kill It
- Incrementality Testing
- Attribution Settings Cheat Sheet
- FAQ
Best Meta Attribution Setting for Most Advertisers
The best Meta ads attribution setting for most advertisers is 7-day click, 1-day view. It gives Meta's algorithm the widest available conversion signal while limiting view-through credit to a single day. But it's not universal. If you're running lead gen with a short decision cycle, 1-day click often produces cleaner data. And if you sell high-consideration products, you'll want to compare 7-day click only against the default to see how much view-through is inflating your numbers.
We've managed attribution settings across [NEEDS REAL DATA] Meta ad accounts at Jetfuel. This guide breaks down each window, when to use it, and what changed in January and March 2026.
What Changed in January and March 2026
Two major changes hit Meta's attribution system in early 2026, and every advertiser needs to understand both.
January 12, 2026: Meta removed 7-day view and 28-day view attribution windows, shrinking the maximum view-through measurement window from 28 days to 1 day. Advertisers who relied on longer view windows saw reported conversions drop 15-30%. Some advertisers had 30-40% of their conversions attributed to the 8-28 day window that was deprecated. Industries with longer sales cycles (B2B, luxury, high-ticket ecommerce) were hit hardest. (Dataslayer, 2026)
March 3, 2026: Meta's click-through attribution now requires an actual link click. Previously, likes, comments, shares, and other interactions counted as "clicks" for attribution. This is a significant tightening of what counts as an ad-driven conversion. (Jon Loomer Digital, 2026)
Engaged-view update: Meta's engaged-view attribution threshold dropped from 10 seconds to 5 seconds of video watched (or 97% of a short video, whichever comes first). This matters for video advertisers tracking view-through conversions from video ads. (1ClickReport, 2026)
Attribution Window Comparison
Each attribution window serves a different purpose. Here's how they compare side by side.
| Window | Best For | Pros | Cons |
|---|---|---|---|
| 7-day click / 1-day view (default) | Most ecommerce campaigns, broad prospecting | Largest conversion signal for algorithm optimization. Captures delayed purchases and single-session impulse buys. Meta's recommended default. | View-through conversions can inflate ROAS. Hard to compare directly with GA4. Some conversions may be coincidental, not ad-driven. |
| 7-day click only | High-AOV ecommerce, campaigns where you distrust view-through data | Cleaner data by removing view-through noise. Better alignment with GA4 reporting. Still captures multi-day purchase decisions. | Smaller optimization signal for the algorithm. May slow exit from learning phase on lower-budget campaigns. |
| 1-day click | Lead gen, free trials, low-consideration offers, remarketing non-purchase events | Tightest attribution. Best reflects true ad-driven action. Easiest to validate against CRM data. | Misses legitimate multi-day conversion paths. Algorithm gets fewer signals, which can reduce delivery efficiency. |
| 28-day click (API only) | Long sales cycles (B2B, luxury, real estate) | Captures the full decision window for high-consideration purchases. Available via Marketing API and some third-party tools. | Not available in Ads Manager UI since January 2026. Inflates reported numbers. Harder to act on strategically. |
Attribution Settings by Vertical: What Actually Works
Every vertical has a different purchase psychology. The attribution window that works for a $25 skincare brand will wreck reporting for a $3,000 B2B SaaS product. Here's what we've seen work across verticals.
DTC Ecommerce ($20-50 AOV)
Use 7-day click / 1-day view. Products at this price point are impulse-friendly. Someone sees your ad on Monday, thinks about it, buys on Thursday. The 7-day click window captures that. View-through is legitimate here because a $30 purchase doesn't require extensive research.
High-Ticket Ecommerce ($200+ AOV)
Use 7-day click only. Drop view-through entirely. When someone spends $200+, they didn't convert because they passively saw your ad in their feed. They clicked, researched, compared, maybe talked to a partner, then bought. View-through at this price point credits conversions that were already going to happen.
Lead Gen (B2B)
Use 1-day click. B2B lead gen conversions are binary. Someone either fills out the form in the same session or they don't. A 7-day window for lead gen doesn't capture "people who needed more time to decide." It captures people who found you through a Google search three days later and would have converted anyway. The longer window pollutes your lead quality data because you're attributing organic leads to paid ads.
Subscription/Recurring
Use 7-day click and supplement with post-purchase surveys. Subscription products (meal kits, supplements, SaaS) sit in a middle ground. The initial conversion often needs a few days of consideration, but you also need to understand which channel actually drove the decision. Post-purchase surveys ("How did you hear about us?") fill the gap that attribution windows can't.
Local Services
Use 1-day click. Plumbers, dentists, home repair. When someone needs a service, they search, find, and book within hours. There is no multi-day consideration phase for "I need a plumber today."
App Installs
Use 7-day click / 1-day view. Meta's SDK handles most of the attribution heavy lifting here. The default window works well because app install attribution flows through a different pipeline than web conversions. Meta's app event tracking is more reliable than its pixel-based web tracking.
How to A/B Test Attribution Windows (The Actual Process)
Nobody talks about this because it costs money. But if you don't test your attribution windows against real backend data, you're making optimization decisions based on numbers that might be 30-50% off. Here's how we actually run this test.
Step 1: Create two identical campaigns. Same creative. Same audience. Same budget. Same optimization event. The only difference is the attribution setting. Campaign A gets 7-day click / 1-day view (the default). Campaign B gets 7-day click only. You can also test 1-day click as a third variant if budget allows.
Step 2: Run both campaigns for a minimum of two weeks. You need enough conversions in each campaign to reach statistical significance. If you're spending $50/day per campaign and your CPA is $40, two weeks gives you roughly 17-18 conversions per campaign. That's the bare minimum. Three weeks is better. Four is ideal.
Step 3: Pull the numbers from Ads Manager. For each campaign, record: total conversions, cost per conversion, ROAS, and the breakdown of click-through vs. view-through conversions (available in the attribution setting column in Ads Manager).
Step 4: Go to your backend. Pull actual orders from Shopify, actual leads from your CRM, actual signups from your database. Match the date range exactly.
Step 5: Run the Shopify delta method. For each campaign, calculate: (Meta-reported conversions minus actual backend conversions) divided by Meta-reported conversions. That gives you the inflation percentage for each attribution setting.
Step 6: Pick the attribution window where Meta's numbers most closely match your backend. That's your setting going forward. You'll still have some gap because Meta uses modeled conversions for cookie-blocked users, but you want that gap to be as small as possible.
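The delta-method math in Steps 5 and 6 is simple enough to sketch in a few lines. All the figures below are placeholders — substitute your own Ads Manager totals and backend (Shopify or CRM) counts for the matched date range.

```python
# Hypothetical test results: Meta-reported vs. backend-verified conversions
# for each attribution setting being tested.
campaigns = {
    "7d-click + 1d-view": {"meta_reported": 180, "backend_actual": 120},
    "7d-click only":      {"meta_reported": 140, "backend_actual": 120},
}

def inflation_pct(meta_reported, backend_actual):
    """(Meta-reported - backend actual) / Meta-reported, as a percentage."""
    return (meta_reported - backend_actual) / meta_reported * 100

for setting, counts in campaigns.items():
    pct = inflation_pct(counts["meta_reported"], counts["backend_actual"])
    print(f"{setting}: {pct:.1f}% inflation")

# Step 6: keep the setting whose numbers sit closest to your backend.
best = min(campaigns, key=lambda s: abs(inflation_pct(**campaigns[s])))
print("Closest match to backend:", best)
```

With these illustrative numbers, the default window shows 33.3% inflation versus 14.3% for click-only, so click-only wins. Expect a nonzero gap either way because of Meta's modeled conversions.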
Why most brands skip this test: it requires running duplicate campaigns, which means splitting budget and temporarily reducing efficiency. For a brand spending $5K/month, dedicating $2K to a two-week test feels expensive. But consider the alternative: if you're using the wrong attribution window and your reported ROAS is inflated by 40%, every scaling decision you make is based on bad data. The test pays for itself within the first month of corrected budgets.
How we handle this at Jetfuel: for every new client, we run an attribution window test in the first 30 days. We set aside [NEEDS REAL DATA]% of the monthly budget for the test. We run the Shopify delta method (or CRM delta for lead gen clients) and lock in the attribution setting before we start scaling.
The View-Through Problem: When to Trust It and When to Kill It
View-through attribution is the most controversial setting in Meta ads. Here's what actually happens with a view-through conversion: someone scrolls through their feed, your ad appears on screen for at least one second, they don't click, and then within 24 hours they convert on your site through some other path. Maybe they typed your URL directly. Maybe they clicked a Google ad. Maybe they found you through organic search. Meta counts that as a view-through conversion and attributes it to the ad they saw.
The honest truth: view-through is real for some products and garbage for others.
Where View-Through Works
Impulse-price products in the $10-30 range. Someone sees your ad for a $15 phone case, doesn't click because they're in the middle of something, then goes to your site 20 minutes later and buys. That ad genuinely influenced the purchase. View-through legitimately captures that influence.
Where View-Through Falls Apart
Anything over $100 or anything that involves a research phase. Nobody sees a $500 mattress ad, doesn't click, and then coincidentally buys a $500 mattress from that exact brand within 24 hours because of that fleeting impression. Those people were already in a buying cycle.
How to Test Whether Your View-Through Conversions Are Real
Run a holdout test. Pause all Meta ads for 48-72 hours. If your direct traffic and organic conversions don't drop during that period, then the "view-through conversions" Meta was reporting were happening regardless of whether the ad ran.
This connects directly to the Meta vs. GA4 discrepancy. When brands tell us "Meta says our ROAS is 4x but GA4 says it's 2.5x," view-through is almost always the biggest driver of that gap. GA4 doesn't count view-through at all. Meta counts it by default. The 20-50% conversion gap between platforms shrinks dramatically when you remove view-through from Meta's reporting.
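The 25% view-through threshold mentioned throughout this guide is easy to check from the attribution breakdown in Ads Manager. This sketch uses made-up numbers to show the arithmetic, including a rough estimate of what ROAS looks like with view-through credit stripped out:

```python
# Illustrative figures -- pull the real click-through vs. view-through
# split from the attribution setting breakdown in Ads Manager.
click_through = 90
view_through = 45
reported_roas = 4.0  # ROAS Meta shows under the default window

total = click_through + view_through
vt_share = view_through / total * 100
print(f"View-through share: {vt_share:.0f}%")

if vt_share > 25:
    # Rough click-only ROAS: scale reported ROAS by the click-through share.
    click_only_roas = reported_roas * (click_through / total)
    print(f"Above 25% -- test click-only. Approx click-only ROAS: {click_only_roas:.2f}")
```

Here a third of conversions are view-through, and the "4x ROAS" drops to roughly 2.7x without them — close to what GA4 would show, which is the point of the comparison above.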
Incrementality Testing: The Only Way to Know If Your Ads Actually Work
Attribution windows tell you who converted after interacting with your ad. They don't tell you whether the ad caused the conversion. That's the difference between attribution and incrementality, and it's the most important distinction in paid media.
Incrementality in plain English: would this person have bought from you even if they never saw the ad? If the answer is yes, the ad didn't generate that sale. It just took credit for it. Incrementality testing measures the gap between "people who saw your ad and converted" and "people who would have converted anyway."
Meta's Conversion Lift Test
Meta splits your target audience into two groups: a test group that sees your ads and a holdout group that doesn't. Both groups are drawn from the same targeting pool. After the test period, you compare conversion rates between the two groups. The difference is your incremental lift.
To run a Conversion Lift test, you need to contact your Meta rep or access it through the Experiments section in Ads Manager. Minimum requirements: you generally need to be spending at least $5,000-10,000/month on the campaign you want to test, and you need to run the test for at least two weeks.
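The lift arithmetic behind a test/holdout comparison looks like this. Group sizes and conversion counts here are invented for illustration; Meta's Conversion Lift product reports the lift and its statistical confidence for you, so treat this as a sketch of the concept rather than the tool's output:

```python
# Hypothetical test: equal-sized groups drawn from the same targeting pool.
test_users, test_conversions = 100_000, 1_200        # group that saw ads
holdout_users, holdout_conversions = 100_000, 1_000  # group held out

test_rate = test_conversions / test_users
holdout_rate = holdout_conversions / holdout_users

# Conversions the ads actually caused, and the relative lift.
incremental_conversions = (test_rate - holdout_rate) * test_users
lift_pct = (test_rate - holdout_rate) / holdout_rate * 100

print(f"Incremental conversions: {incremental_conversions:.0f}")
print(f"Relative lift: {lift_pct:.1f}%")
```

In this example, only 200 of the 1,200 conversions in the test group were incremental — the other 1,000 would have happened anyway, which is exactly the gap attribution alone can't see.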
Geographic Holdout Testing (DIY Alternative)
Pick two similar markets. Run ads in Market A but not in Market B. Compare sales in both markets over the same period. If Market A (with ads) sells 20% more than Market B (without ads), you have a rough measure of incrementality. The key: pick markets that are genuinely comparable in population, demographics, and historical sales patterns.
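Even well-matched markets drift, so a common refinement is to compare each market's test-period sales against its own pre-period baseline (a simple difference-in-differences). The sales figures below are hypothetical:

```python
# Hypothetical 4-week totals per market, before and during the test.
pre  = {"market_a": 50_000, "market_b": 48_000}  # no ads in either market
test = {"market_a": 62_000, "market_b": 50_400}  # ads running in A only

growth_a = test["market_a"] / pre["market_a"] - 1  # growth with ads
growth_b = test["market_b"] / pre["market_b"] - 1  # baseline drift without ads

# Lift attributable to the ads, net of whatever both markets did anyway.
incremental_lift = growth_a - growth_b
print(f"Approx incremental lift from ads: {incremental_lift:.1%}")
```

Here Market A grew 24% but Market B grew 5% on its own, so the ads get credit for roughly 19 points of lift, not the full 24.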
Brand Search Lift Proxy (Quick and Cheap)
Track your branded search volume (via Google Search Console or Google Trends) during periods when Meta ads are running vs. paused. If branded searches drop when you turn off Meta, the ads are driving awareness that leads to branded search. If branded searches stay flat, Meta is mostly capturing existing demand rather than creating new demand.
Meta Attribution Settings Cheat Sheet (2026)
Use this quick reference to pick the right attribution window for your campaign type.
Available Windows (Post-January 2026)
| Setting | What It Measures |
|---|---|
| 1-day click | Conversions within 24 hours of a link click |
| 7-day click | Conversions within 7 days of a link click |
| 1-day view | Conversions within 24 hours of an ad view (no click) |
| 7-day click + 1-day view | Both click and view-through (Meta default) |
Recommended Settings by Campaign Objective
| Campaign Type | Recommended Window | Why |
|---|---|---|
| Awareness / Reach | N/A (no conversion event) | Attribution windows only apply to conversion-optimized campaigns. Track reach and frequency instead. |
| Consideration / Traffic | 7-day click + 1-day view | Default works. You're optimizing for engagement, not purchases. Keep the signal broad. |
| Conversion / Purchase | 7-day click + 1-day view OR 7-day click only | Start with default. If view-through > 25% of conversions, test click-only to validate real impact. |
| Lead Gen (forms, signups) | 1-day click | Short decision cycle. 1-day click gives cleanest signal for leads that convert same-session. |
| Remarketing | 7-day click (purchases) or 1-day click (non-purchase) | Remove view-through for remarketing. These users already know your brand; a view didn't cause the conversion. |
| High-ticket / Long cycle | 7-day click + 1-day view (supplement with API 28-day data) | Use default in Ads Manager. Pull 28-day click data via API or third-party tool for full-funnel analysis. |
Quick Decision Checklist
- Is the conversion free or low-friction? Use 1-day click
- Does the purchase require research/comparison? Use 7-day click
- Is view-through > 25% of conversions? Test click-only
- Are you running remarketing? Remove view-through
- Sales cycle longer than 7 days? Supplement with API data
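The checklist above can be encoded as a simple picker. The thresholds (25% view-through share, 7-day sales cycle) come from this guide; the function itself is a rule-of-thumb starting point, not a rulebook — the remarketing branch, for instance, collapses to 1-day click rather than splitting purchase vs. non-purchase events.

```python
def pick_attribution_window(
    low_friction: bool,        # free or low-friction conversion?
    needs_research: bool,      # purchase requires research/comparison?
    view_through_share: float, # 0.0-1.0, from the Ads Manager breakdown
    is_remarketing: bool,
    sales_cycle_days: int,
) -> str:
    """Rule-of-thumb attribution window from the decision checklist."""
    if is_remarketing or low_friction:
        return "1-day click"
    window = "7-day click" if needs_research else "7-day click + 1-day view"
    if view_through_share > 0.25:
        window = "7-day click"  # test click-only before trusting view-through
    if sales_cycle_days > 7:
        window += " (supplement with 28-day API data)"
    return window

# Example: high-ticket product, 30% view-through, 30-day cycle.
print(pick_attribution_window(False, True, 0.30, False, 30))
```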
For related strategies, see our guides on common retargeting mistakes, building lookalike audiences, retargeting strategies for food and beverage brands, and measuring short-form video performance.
Frequently Asked Questions
Which attribution window should I use for lead gen vs. ecommerce?
For lead gen, start with 1-day click. Most lead gen conversions (form fills, demo requests, free downloads) happen in the same session. A 7-day window just adds noise from people who saw your ad and converted organically days later. You want to know what the ad actually caused.
For ecommerce, the default 7-day click / 1-day view usually works best. People browse, compare, wait for payday, then buy. A 1-day window misses that behavior and starves the algorithm of conversion data. If you notice view-through conversions making up more than 25-30% of your total, test switching to 7-day click only to see if your numbers hold up.
How does Meta attribution compare to GA4?
They measure different things, and they'll never match. Meta uses last-touch, event-based attribution: if someone clicked your ad within 7 days (or viewed it within 1 day) and then converted, Meta takes credit. GA4 uses data-driven multi-touch attribution by default: it distributes credit across multiple touchpoints in the conversion path.
The practical result: Meta typically reports 20-50% more conversions than GA4 for the same campaigns. That gap widened after January 2026 when Meta started modeling more conversions for cookie-blocked users. Neither platform is "wrong." Use Meta's numbers to optimize campaigns inside Ads Manager. Use GA4 to understand how Meta fits into your full marketing mix. If you need a single source of truth for boardroom reporting, a third-party tool like Triple Whale or Northbeam can help reconcile the two.
Need Help With Your Meta Attribution Setup?
We run attribution window tests, incrementality analysis, and full-funnel audits for Meta ad accounts. Let us find the right settings for your campaigns.
