AI - We hate it. But it ain’t leaving so…
If you’re a crochet designer - hell ANY artist - online right now, you’ve probably had this thought:
“I don’t want AI training on my work.”
Fair.
A lot of artists and crafters are uncomfortable watching AI companies vacuum up tutorials, photos, charts, blog posts, and patterns to train systems that may eventually compete with the creators themselves. Unfortunately, blind revenge probably isn't going to work the way people hope. And don't shoot me, but AI does have its uses. Even art uses… she says, ducking for cover. The long and short of it is, we ain't gonna beat it, so we'd better figure out how to at least live with it.
Or try to fight back.
One idea that spread quickly was deliberately triggering model collapse: the belief that flooding the internet with low-quality AI content (“AI slop”) would poison future AI models until they became useless.
It sounds terrifically petty and satisfying, BIG BUT:
What Is “Model Collapse”?
Model Collapse is a real concept in machine learning.
In simple terms:
AI models trained repeatedly on AI-generated material can slowly degrade over time.
Errors compound.
Diversity shrinks.
Outputs become repetitive or distorted.
Researchers have demonstrated versions of this effect in controlled settings.
But the internet is not a controlled setting.
Large AI companies:
filter datasets heavily
prioritize trusted/high-authority sources
remove duplicated content
use human preference tuning
combine synthetic and human-created datasets
continually retrain on fresh material
So adding a few pages of nonsense to your crochet site is unlikely to meaningfully damage a frontier model.
What it is more likely to damage is your own site's quality.
The Real Risk: Hurting Your Own SEO
Search engines increasingly evaluate:
content quality
originality
expertise
user trust
engagement
consistency
If your website suddenly contains:
gibberish pages
keyword-stuffed AI text
mass-generated filler
low-quality duplicate content
…it may weaken your site’s reputation over time.
And for crochet businesses, search visibility matters.
Because whether we like it or not:
Pinterest traffic matters
Google image search matters
discoverability matters
tutorials and blogs drive pattern sales
If search engines start viewing your site as low trust, the collateral damage can hit harder than the AI crawlers.
So you can’t poison them, but you can create some friction and put up some boundaries.
1. Use robots.txt (Even Though It’s Voluntary)
A robots.txt file tells compliant crawlers what they can and cannot access. Legit companies will likely comply, because when these things go to court, creators can point to their site or content having been on a “do not crawl” setting, leaving Zorg Corporation to explain to an already pissy jury why it crawled anyway.
Here are quick instructions for turning on or editing robots.txt on popular website platforms, especially useful if you want to discourage AI scrapers like GPTBot, ClaudeBot, or CCBot (Common Crawl's crawler).
Squarespace
Go to Settings
Click Developer Tools
Open Custom Robots.txt
Add rules like:
User-agent: GPTBot
Disallow: /
User-agent: ClaudeBot
Disallow: /
User-agent: CCBot
Disallow: /
User-agent: Google-Extended
Disallow: /

Save changes.
Your robots file will live at:
https://yourdomain.com/robots.txt
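The four user agents above are the big names, but you can cast a wider net. Here is a longer version you can adapt on any of the platforms below; these user-agent names are the ones the companies have published as of this writing, and new crawlers appear (and get renamed) constantly, so check a current list before copying blindly:

User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: anthropic-ai
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: Applebot-Extended
Disallow: /

User-agent: meta-externalagent
Disallow: /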
WordPress
Option 1: Plugin (easiest)
Install:
Yoast SEO
Rank Math
All in One SEO
Then:
SEO Settings → Tools/File Editor → robots.txt
Option 2: Manual
Upload/edit a robots.txt file in your website root folder via hosting panel or FTP.
Shopify
Shopify now allows editing robots.txt through a theme template called robots.txt.liquid.
Online Store
Themes
Edit Code
Add new template → robots.txt.liquid
Add:
User-agent: GPTBot
Disallow: /

Save.
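One caveat: creating robots.txt.liquid replaces Shopify's default robots.txt, so you generally want to keep their default rules and append yours after them. Here's a minimal sketch based on the template structure Shopify documents (verify against their current docs before relying on it):

{%- comment -%} Output Shopify's default robots.txt rules first {%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules %}
    {{ rule }}
  {%- endfor %}
  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}

User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

The loop prints Shopify's defaults; the plain rules after it are your additions and end up in the final file as written.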
Wix
Go to SEO Settings
Open Advanced SEO
Find robots.txt editor
Add custom rules
Webflow
Site Settings
SEO tab
Scroll to robots.txt
Paste custom rules
Weebly / Square Online
Weebly/Square Online has very limited robots.txt control.
Usually:
Settings → SEO
Add header/meta restrictions only
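If meta restrictions are all you can touch, some sites add informal opt-out tags like the “noai” and “noimageai” directives DeviantArt popularized. These are conventions rather than standards, and honoring them is entirely voluntary, but they cost nothing to add:

<meta name="robots" content="noai, noimageai">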
For full robots.txt control, many users put the site behind:
Cloudflare
reverse proxy/CDN rules
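If you do route a site through Cloudflare, one common approach is a custom firewall rule that blocks requests whose user agent matches known AI crawlers. A rough sketch of the rule expression in Cloudflare's rules language (adjust the bot list to taste, and check their current docs; they also offer built-in bot-blocking settings that may be simpler):

(http.user_agent contains "GPTBot") or (http.user_agent contains "ClaudeBot") or (http.user_agent contains "CCBot") or (http.user_agent contains "Bytespider")

Set the rule action to Block, or to Managed Challenge if you'd rather not hard-block anything.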
But what about my patterns?
2. Keep Premium Content Behind a Paywall
Public pages are easier to scrape. Protected content is harder. This is one reason many designers are shifting toward:
member areas
subscriber libraries
private PDFs
gated tutorials
Patreon-style access
Even simple friction helps.
A downloadable PDF inside an account system is significantly less exposed than a fully public blog post.
Platforms like Ribblr also create partial ecosystem protection because patterns live inside their platform environment rather than as fully crawlable webpages.
That doesn’t make them immune to copying. But it changes the scraping dynamics.
3. Be Intentional About What You Publish Publicly
Some designers now:
publish teaser versions publicly
watermark their pages
keep full charts inside PDFs
omit stitch counts from blog previews
use lower-resolution photos online
avoid posting entire premium patterns openly
That balance lets search engines discover your work without giving away the entire product.
4. Use Clear Licensing Terms
Licensing will not magically stop infringement. But clear terms matter. Unfortunately, standard licenses like Creative Commons don't address AI training, and RAIL licenses are geared more toward code and model conduct. But I have added this to all of my patterns going forward (and I'm working backwards through the older ones):
(C) 2026 Huskyberry LLC. All Rights Reserved. This work may not be used for training, fine-tuning, or developing artificial intelligence or machine learning systems without explicit written permission.
Some designers include other clauses such as:
personal use only
no AI training
no dataset inclusion
no redistribution
no commercial resale of pattern text or charts
This area is still evolving legally. But documenting your intent is increasingly important.
5. Build Community, Not Just Content
This is the uncomfortable truth:
AI can imitate instructions.
It cannot replace:
community
personality
trust
teaching style
humor
creative voice
lived expertise
People follow crochet designers because they enjoy them.
Your:
weird yarn opinions
cat interruptions
project chaos
color choices
encouragement
mistakes
personality
…are part of the value.
That is much harder to scrape.
The Reality: There Is No Perfect Shield
If you publish online, some level of scraping risk exists.
That’s true for crochet, illustration, writing, photography, coding, literally everything online.
Perfect protection will not be possible. The goal should be to:
increase friction
reduce unnecessary exposure
protect premium work
maintain search visibility
make informed platform choices
build a business people want to support directly
Because ironically, discoverability and protection often pull in opposite directions. The more public your work is, the easier it is to find. The easier it is to find, the easier it is to scrape.
So the real challenge for creators now is balance, not panic.
And honestly? Crochet designers are already pretty good at balancing tension.
This is not legal advice. I am not an engineer or attorney, just an unwilling vessel of knowledge in this area.