Semrush released a study where they analyzed 325,000 prompts across ChatGPT Search, Google AI Mode, and Perplexity and found 89,000 unique LinkedIn URLs cited in AI answers.
The key finding for marketers is that the citation patterns reveal how LinkedIn content is most likely to be used in AI search, which is useful for planning and executing AI search optimization.
It is also worth being clear about the limits. This report doesn’t show whether those citations drove clicks, leads, pipeline, or sales, and it doesn’t show that LinkedIn outperformed a brand’s own site on the same topic.
What it does show is that LinkedIn is being used as a source in AI answers often enough to change how marketers should think about their strategy.
I am breaking down this data and what it means for marketers.
Summary of Key Findings
These are the main findings from the Semrush citation study:
- LinkedIn was the second most cited domain in the dataset and appeared in 11% of AI responses on average across ChatGPT Search, Google AI Mode, and Perplexity.
- Cited LinkedIn content showed high semantic similarity with the AI answers that used it, suggesting that source content can influence how the answer is written too.
- LinkedIn articles accounted for 50% to 66% of cited LinkedIn content, while posts accounted for 15% to 28%.
- The most cited articles were 500 to 2,000 words, and the most cited posts were 50 to 299 words.
- About 95% of cited posts were original rather than reshares, and 54% to 64% were educational or advice-led.
- Around three-quarters of cited post authors had published at least five times in the previous four weeks, while median engagement on cited posts stayed modest at 15 to 25 reactions and no more than one comment.
- Perplexity cited Company Pages most often, while ChatGPT Search and Google AI Mode cited individual members most often.
The report suggests LinkedIn content is being picked up for AI retrieval based more on usefulness, format, originality, and publishing consistency than on popularity.
LinkedIn Is Favored in AI Responses
Semrush’s data shows that LinkedIn appeared in 11% of AI responses on average: ChatGPT Search cited LinkedIn in 14.3% of responses, Google AI Mode in 13.5%, and Perplexity in 5.3%.

That average is based on a dataset which focuses on professional and B2B-heavy prompts. The sample leaned toward categories like technology, business services, finance, and industrial topics. So this is less useful as a statement about the whole web, and more useful for marketers, SaaS companies, consultants, and other expertise-led brands.
The other important number is where LinkedIn ranked overall. In Semrush’s dataset, LinkedIn was the second most cited domain, and separate data from Profound finds LinkedIn the most cited domain of all. That puts it ahead of sites many marketers would expect to dominate, including publishers and reference platforms.
The key point here is that LinkedIn has a strong influence in AI search and how LLMs answer questions in B2B categories. If a buyer asks about a topic, a process, or a type of solution, LinkedIn content now has a chance of being part of what they see.
LinkedIn isn’t replacing a website, and it is not suddenly the main source of visibility. But the data does support a change in how it should be viewed. For B2B brands, it can be a channel that influences whether your brand gets pulled into AI responses.
How LinkedIn Content Influences AI Answers
The report found high semantic similarity between cited LinkedIn content and the AI answers that used it, with scores ranging from 0.57 to 0.60. That suggests AI wasn’t just using LinkedIn as a reference point; the wording and meaning of the answers stayed close to the source content.
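To make those similarity scores concrete: numbers like 0.57 to 0.60 are typically produced by embedding two pieces of text as vectors and comparing them with cosine similarity. The sketch below is a minimal illustration with made-up four-number vectors; real pipelines embed full texts with an embedding model, and the `cosine_similarity` function here is my own, not something from the Semrush report.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for a LinkedIn post and an AI answer.
# These numbers are invented purely for illustration.
source = [0.9, 0.1, 0.2, 0.1]
answer = [0.4, 0.8, 0.3, 0.5]

score = cosine_similarity(source, answer)
print(round(score, 2))  # prints 0.55
```

A score of 1.0 means the vectors point in the same direction and 0 means no overlap, so values around 0.6 indicate a noticeable overlap in meaning, which is the range Semrush reported.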
That changes the perspective on LinkedIn content. A post or article is not only a way to secure a citation or mention. It can also influence how a topic is framed once it appears in an AI answer.
The larger issue is explanation quality. Many brands use LinkedIn for quick reactions, simplified opinions, rigid social copy, or, worse, recycled AI slop. That can work when the goal is engagement. It’s less effective when the same content might carry through to an AI answer and shape awareness of the brand.
The difference comes down to specificity. Strong LinkedIn content gives clearer explanations, distinctions, and more focused language about what a product or service does.
So LinkedIn can also affect the quality of the explanation once a brand is included. Think about it this way: if your content gets pulled and satisfies the user’s intent (you can’t track this) by informing the AI answer, that user could come to your brand directly.
LinkedIn Articles Still Carry More Weight Than Many Marketers Assume
The data shows LinkedIn articles accounted for 50% to 66% of cited LinkedIn content, while posts accounted for 15% to 28%. The most cited article length was 500 to 2,000 words. The most cited post length was 50 to 299 words.

It suggests AI tools aren’t pulling from LinkedIn content evenly. They appear to rely more on formats that give them enough context to answer.
Many marketers now judge LinkedIn content by the feed alone. By that standard, articles can look pointless, since they usually attract less engagement than posts. But this dataset suggests that articles may be more useful for creating durable source material for AI search.
If a company publishes all of its substantive thinking on its website and uses LinkedIn only for short posts, it may be limiting the amount of retrievable content attached to its presence on LinkedIn. Articles give brands more room to influence citability and the message behind it.
AI Citations Favor Original Posts
About 95% of cited posts were original, while reshares accounted for only around 5%. Semrush also found that roughly 54% to 64% of cited posts were educational or advice-led.

That’s one of the clearest findings in the report. AI tools don’t seem to favor LinkedIn activity in general; they appear to favor posts that contain original explanation.
It’s easy to miss because much of LinkedIn still runs on reaction. A large share of posts are built around reposting news, responding to someone else’s content, or adding a quick comment to an existing post. That may still help with engagement on the platform, but this data suggests it has hardly any influence on being cited in AI search.
The difference is that original posts are more self-contained. They usually include a specific point of view, idea, or explanation. Reshares and commentary often rely on outside context, which makes them less trustworthy as source material for AI.
That doesn’t mean curated content has no value. It means marketers should stop treating curation as equal to original explanation in long-term content value. Each format has a different purpose and this report suggests only one of them is consistently being cited.
The takeaway is nothing new, but it matters more in an AI search context: first-hand expertise is more important than recycled commentary.
Consistency Appears to Be More Important Than Popularity
The data points in that direction. Around three-quarters of cited authors had posted five or more times in the previous four weeks. At the same time, median engagement on cited posts was roughly 15 to 25 reactions and no more than one comment. Smaller creators also appeared among cited authors.

That weakens the standard assumption that strong feed performance is the clearest sign of content value. In this dataset, cited content didn’t consistently look like viral content. It often looked ordinary by LinkedIn’s engagement standards.
It shows that a post doesn’t need to explode in the feed to have value later. Content with average platform performance may still contribute to whether a brand, topic, or expert gets cited in AI answers.
This is also where vanity-metric thinking starts to break down. High reactions can show reach inside the platform. They don’t appear to be the main factor behind whether content becomes part of what AI cites.
The conclusion is that topic coverage may count for more than occasional standout performance. For marketers, that shifts the goal from chasing a viral post to publishing enough useful material on a subject for expertise to show consistently.
Small Experts May Have More Room Than They Think
The follower data is more mixed than many marketers might expect. Nearly half of cited post authors had more than 2,000 followers. At the same time, Semrush found that creators with fewer than 500 followers were just as likely to be cited as creators with more than 500 followers, and in some cases more likely.

That doesn’t mean audience size is irrelevant. A large following still helps, but in this dataset it was not essential, and smaller creators were not excluded from citation visibility.
That’s the strongest point here for consultants, niche operators, and smaller brands. There is a disadvantage, but it’s not absolute. For citations in AI search, clear expertise, regular publishing, and tight topic coverage look more important than trying to look like a large creator.
Company Pages and Personal Profiles Both Have a Role
The report found a clear split by AI tool. Perplexity cited Company Pages most often, with Company Pages making up 59% of its LinkedIn citations. On ChatGPT Search and Google AI Mode, individual members made up 59% of LinkedIn citations each.
That result is interesting because company pages still look weak by normal LinkedIn organic standards.
Company-page performance has been a hot topic over the past year or two as reach has tanked across the board. A separate report showed that organic company-page posts were reaching only about 1.6% of followers and accounting for just 1% to 2% of LinkedIn feeds, down from 7% in 2021.
So company pages may be weak inside LinkedIn while still being useful citation assets inside AI answers.
Company Page content helps AI connect a claim to the brand, which builds trust. Founder and employee content connects that claim to a person, a point of view, and visible expertise, which builds trust again.
Trust = increased citation likelihood.
What Marketers Should Change After Reading This Report
TL;DR
- Publish more standalone expertise on LinkedIn
- Prioritize original posts over reshares and commentary
- Use articles for depth and posts for repetition
- Don’t assess value by engagement alone
- Repeat the same topics instead of posting randomly
- Publish from both the brand and the people behind it
LinkedIn should be used more as a place to publish standalone expertise that supports an AI search strategy. That means more original posts with context, more detailed articles, and less reliance on reposts and commentary that depend on someone else’s material.
It also means assessing content differently. Average engagement doesn’t automatically mean low value if a post adds useful language, examples, or perspective around a topic you want to be known for. In this report, consistency and topical focus look more useful than occasional spikes.
Publish around a small number of topics, repeat them across posts and articles, and make sure the LinkedIn version contains enough substance to be pulled.
FAQ – LinkedIn Citations in AI Search
Does LinkedIn actually show up in AI search answers?
Yes. LinkedIn appeared in 11% of AI responses on average across ChatGPT Search, Google AI Mode, and Perplexity, and it ranked as the second most cited domain in the dataset.
What kind of LinkedIn content gets cited most?
Original, educational, and advice-led content appears most often in citations. LinkedIn articles also made up a larger share of cited content than posts, which suggests AI tools often prefer fuller source material.
Are LinkedIn articles still worth publishing?
Yes, especially when a topic needs more explanation than a short post can hold. They may not attract the most attention in the feed, but the data suggests they can still be useful as citation material in AI answers.
Do reshares and commentary help with AI citations?
Much less than original posts do. About 95% of cited posts were original, which suggests AI tools are more likely to use content that stands on its own.
Does high engagement mean a post is more likely to be cited?
Not necessarily. Median engagement on cited posts was modest, which suggests feed popularity is not the same thing as citation value. A post can look average on LinkedIn and still be useful in AI retrieval.
Can smaller creators get cited?
Yes. Larger audiences still help, but smaller creators were also represented in the cited set. That suggests a large following is useful, but not required.
Should you publish from a Company Page or a personal profile?
Both. Company Pages were cited more often by some AI tools, while others cited individual members more often. Company Pages help support the brand, while personal profiles help support expertise and named authority.


