Garlic 🧄🧛
Garlic is a simple, fast and secure way to protect your website from being scraped by bots.
Try Now!
- Go to this website: https://glittery-croquembouche-c25561.netlify.app/
- You will see the text displayed normally, the way it should be
- Run the following command:
wget https://glittery-croquembouche-c25561.netlify.app/
- Open the downloaded index.html
- You will see the encoded text 🙂
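If you would rather stay in JavaScript than use wget, a quick script shows the same thing (a minimal sketch, assuming Node 18+ where fetch is available as a global):
fetch('https://glittery-croquembouche-c25561.netlify.app/')
  .then((res) => res.text())
  .then((html) => console.log(html)); // the raw markup a scraper receives, with the encoded text instead of the readable copy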
How?
Garlic is currently in development, but the beta works like this:
React
Install the package with npm i garlic-react, then import the Garlic class:
import Garlic from 'garlic';
You then just need to wrap your HTML in the Garlic.clove() method:
function App() {
  // Wrap the returned JSX in Garlic.clove() so Garlic can protect its text.
  return Garlic.clove(
    <div className="App">
      <p>Go away robots :)</p>
    </div>
  );
}

export default App;
Go to index.js, or anywhere that runs before render, and add the following line of code:
Garlic.peal(document);
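For context, here is a minimal sketch of where that call could sit in a typical index.js; the react-dom/client boilerplate and file names are just standard React 18 conventions, not something Garlic prescribes:
import React from 'react';
import ReactDOM from 'react-dom/client';
import Garlic from 'garlic';
import App from './App';

// Run Garlic on the document before anything is rendered, as described above.
Garlic.peal(document);

const root = ReactDOM.createRoot(document.getElementById('root'));
root.render(<App />);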
And you're done!
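From the demo, it looks like the work splits into two halves: Garlic.clove() produces the encoded text that ends up in the static HTML, and Garlic.peal() turns it back into readable text in the visitor's browser. That is only an inference from the behaviour above; the snippet below is a purely illustrative toy using Base64, runnable in a browser console or modern Node, and is not Garlic's actual code or API:
// Toy illustration of the idea only; Garlic's real encoding is not documented here.
const encode = (text) => btoa(text); // a clove()-like step could run when the page is built
const decode = (text) => atob(text); // a peal()-like step could run in the browser

console.log(encode('Go away robots :)'));         // "R28gYXdheSByb2JvdHMgOik=", what a scraper would see
console.log(decode('R28gYXdheSByb2JvdHMgOik=')); // "Go away robots :)", what the visitor sees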
Astro
Coming soon 🙂
Why?
AI needs data, and your website might end up in a training dataset. Don't want that? Garlic should help 🙂