[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"$f13cOpkFZysNQSvAI3COgZusUjJ3B5F7WPpaxMoj6mC4":3},{"slug":4,"name":5,"version":6,"author":7,"author_profile":8,"description":9,"short_description":10,"active_installs":11,"downloaded":12,"rating":11,"num_ratings":11,"last_updated":13,"tested_up_to":14,"requires_at_least":15,"requires_php":16,"tags":17,"homepage":13,"download_link":23,"security_score":24,"vuln_count":11,"unpatched_count":11,"last_vuln_date":25,"fetched_at":26,"vulnerabilities":27,"developer":28,"crawl_stats":25,"alternatives":36,"analysis":138,"fingerprints":287},"scraperguard","ScraperGuard – AI Scraper Blocker","1.0.0","KNEET","https:\u002F\u002Fprofiles.wordpress.org\u002Fkneet\u002F","\u003Cp>ScraperGuard helps you block known AI scrapers (often called “good bots”) by matching their User-Agent string.\u003C\u002Fp>\n\u003Cp>You can:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Select specific bots to block, or block all known bots.\u003C\u002Fli>\n\u003Cli>Add your own custom User-Agent substrings (one per line).\u003C\u002Fli>\n\u003Cli>Block via Apache \u003Ccode>.htaccess\u003C\u002Fcode> (fast, before WordPress loads) \u003Cstrong>or\u003C\u002Fstrong> via WordPress-level blocking (can show basic stats).\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Important notes:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>This plugin can block “good bots” that identify themselves. It cannot stop “bad bots” that ignore rules and\u002For spoof User-Agents. 
For that you may need additional security measures (WAF, rate limiting, bot protection).\u003C\u002Fli>\n\u003Cli>\u003Ccode>.htaccess\u003C\u002Fcode> blocking works on Apache hosting only, and requires a writable \u003Ccode>.htaccess\u003C\u002Fcode> file.\u003C\u002Fli>\n\u003Cli>WordPress-level blocking only affects requests that reach WordPress (it won’t block direct hits to static files unless they route through WordPress).\u003C\u002Fli>\n\u003Cli>Country blocking (geo blocking) can use a country header (fast) or an IP lookup (works without Cloudflare but is slower).\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>The settings page is under \u003Cstrong>Tools \u003Cspan aria-hidden=\"true\" class=\"wp-exclude-emoji\">→\u003C\u002Fspan> ScraperGuard\u003C\u002Fstrong>.\u003C\u002Fp>\n\u003Ch3>External Services\u003C\u002Fh3>\n\u003Cp>This plugin can optionally connect to third-party IP geolocation services to determine the visitor’s country for country-based blocking. This feature is \u003Cstrong>disabled by default\u003C\u002Fstrong> and only activates when you explicitly enable “Country blocking” in the settings.\u003C\u002Fp>\n\u003Cp>\u003Cstrong>When country blocking is enabled and the “Country detection method” is set to “Auto” or “IP lookup”:\u003C\u002Fstrong>\u003C\u002Fp>\n\u003Cul>\n\u003Cli>\u003Cstrong>Service used\u003C\u002Fstrong>: The plugin uses either ipwho.is or ipapi.co (configurable in settings)\u003C\u002Fli>\n\u003Cli>\u003Cstrong>Data sent\u003C\u002Fstrong>: The visitor’s IP address is sent to the selected service\u003C\u002Fli>\n\u003Cli>\u003Cstrong>When data is sent\u003C\u002Fstrong>: Only when a request is received and no country header is available from your server\u002Fproxy\u003C\u002Fli>\n\u003Cli>\u003Cstrong>Purpose\u003C\u002Fstrong>: To determine the visitor’s country code (ISO-2) for geo-blocking\u003C\u002Fli>\n\u003Cli>\u003Cstrong>Caching\u003C\u002Fstrong>: Results are cached locally for 24 hours by default (configurable 
1-168 hours) to minimize requests\u003C\u002Fli>\n\u003Cli>\u003Cstrong>Privacy\u003C\u002Fstrong>: IP addresses are sent to external services. Ensure compliance with your privacy policy and local regulations.\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>\u003Cstrong>ipwho.is (default provider):\u003C\u002Fstrong>\u003Cbr \u002F>\n* Service provider: ipwho.is\u003Cbr \u002F>\n* Privacy policy: https:\u002F\u002Fipwho.is\u002F\u003Cbr \u002F>\n* Terms of service: https:\u002F\u002Fipwho.is\u002F\u003Cbr \u002F>\n* No API key required\u003C\u002Fp>\n\u003Cp>\u003Cstrong>ipapi.co (alternative provider):\u003C\u002Fstrong>\u003Cbr \u002F>\n* Service provider: ipapi.co\u003Cbr \u002F>\n* Privacy policy: https:\u002F\u002Fipapi.co\u002Fprivacy\u002F\u003Cbr \u002F>\n* Terms of service: https:\u002F\u002Fipapi.co\u002Fterms\u002F\u003Cbr \u002F>\n* No API key required for basic usage\u003C\u002Fp>\n\u003Cp>\u003Cstrong>Important\u003C\u002Fstrong>: If you keep the “Country detection method” set to “Header only” (the default), or if you don’t enable country blocking at all, no data is sent to external services.\u003C\u002Fp>\n","Block “good bots” (AI scrapers) by User-Agent. 
Optional Apache .htaccess rules and WordPress-level blocking with basic stats.",0,109,"","6.9.4","5.8","7.4",[18,19,20,21,22],"ai","bots","htaccess","scraper","user-agent","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Fscraperguard.1.0.0.zip",100,null,"2026-03-15T10:48:56.248Z",[],{"slug":29,"display_name":7,"profile_url":8,"plugin_count":30,"total_installs":31,"avg_security_score":32,"avg_patch_time_days":33,"trust_score":34,"computed_at":35},"kneet",6,1010,97,30,92,"2026-04-05T20:19:16.776Z",[37,59,79,101,120],{"slug":38,"name":39,"version":40,"author":41,"author_profile":42,"description":43,"short_description":44,"active_installs":11,"downloaded":45,"rating":11,"num_ratings":11,"last_updated":46,"tested_up_to":47,"requires_at_least":48,"requires_php":49,"tags":50,"homepage":55,"download_link":56,"security_score":57,"vuln_count":11,"unpatched_count":11,"last_vuln_date":25,"fetched_at":58},"check-system-details","Check System Details","1.1.1","Prabhat","https:\u002F\u002Fprofiles.wordpress.org\u002Fprabhatrai\u002F","\u003Ch3>Check System Details – Easily Check Your Installation and Server Details\u003C\u002Fh3>\n\u003Ch3>No Setup Required\u003C\u002Fh3>\n\u003Ch3>The details include:\u003C\u002Fh3>\n\u003Cp>WordPress Details\u003C\u002Fp>\n\u003Cul>\n\u003Cli>WordPress address\u003C\u002Fli>\n\u003Cli>Site address\u003C\u002Fli>\n\u003Cli>WordPress version\u003C\u002Fli>\n\u003Cli>WordPress multisite\u003C\u002Fli>\n\u003Cli>WordPress memory\u003C\u002Fli>\n\u003Cli>WordPress debug\u003C\u002Fli>\n\u003Cli>WordPress cron\u003C\u002Fli>\n\u003Cli>WordPress Language\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Current Theme Details\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Name\u003C\u002Fli>\n\u003Cli>Version\u003C\u002Fli>\n\u003Cli>Description\u003C\u002Fli>\n\u003Cli>Author\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Installed Plugin Details\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Plugin name with hyperlink to plugin’s 
website\u003C\u002Fli>\n\u003Cli>Version\u003C\u002Fli>\n\u003Cli>Author\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Server Details\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Max input timeout\u003C\u002Fli>\n\u003Cli>Max input vars\u003C\u002Fli>\n\u003Cli>Max file upload\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Database Details\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Database Name\u003C\u002Fli>\n\u003Cli>Database Host\u003C\u002Fli>\n\u003Cli>Database Charset\u003C\u002Fli>\n\u003Cli>Database Prefix\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Names of existing Database Tables\u003C\u002Fp>\n\u003Ch3>New Addition\u003C\u002Fh3>\n\u003Cp>htaccess File Content\u003C\u002Fp>\n\u003Cp>Robots.txt Content\u003C\u002Fp>\n\u003Cp>Debug log content – Last 100 lines\u003C\u002Fp>\n\u003Ch3>A quick overview of the plugin:\u003C\u002Fh3>\n\u003Cspan class=\"embed-youtube\" style=\"text-align:center; display: block;\">\u003Ciframe loading=\"lazy\" class=\"youtube-player\" width=\"750\" height=\"422\" src=\"https:\u002F\u002Fwww.youtube.com\u002Fembed\u002FVTXt1PNqXkU?version=3&rel=1&showsearch=0&showinfo=1&iv_load_policy=1&fs=1&hl=en-US&autohide=2&wmode=transparent\" allowfullscreen=\"true\" style=\"border:0;\" sandbox=\"allow-scripts allow-same-origin allow-popups allow-presentation allow-popups-to-escape-sandbox\">\u003C\u002Fiframe>\u003C\u002Fspan>\n","Easily check your WordPress installation and server details along with database tables, installed plugins, and the active 
theme.",898,"2023-07-03T06:28:00.000Z","6.2.9","4.9","5.6",[51,20,52,53,54],"database-details","robots-txt","server-details","wordpress-details","https:\u002F\u002Fgithub.com\u002FPrabhatKumarRai\u002Fcheck-system-details","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Fcheck-system-details.1.1.1.zip",85,"2026-03-15T15:16:48.613Z",{"slug":60,"name":61,"version":62,"author":63,"author_profile":64,"description":65,"short_description":66,"active_installs":67,"downloaded":68,"rating":69,"num_ratings":70,"last_updated":71,"tested_up_to":72,"requires_at_least":73,"requires_php":13,"tags":74,"homepage":13,"download_link":78,"security_score":34,"vuln_count":11,"unpatched_count":11,"last_vuln_date":25,"fetched_at":58},"spiderblocker","Spider Blocker","1.3.7","Niteo","https:\u002F\u002Fprofiles.wordpress.org\u002Fniteoweb\u002F","\u003Cp>Spider Blocker blocks the most common bots that consume bandwidth and slow down your blog.\u003Cbr \u002F>\nIt accomplishes this by using the .htaccess file to minimize impact on your website. 
It’s hidden from external scanners.\u003C\u002Fp>\n\u003Cp>Spider Blocker is specifically designed for Apache servers with mod_rewrite enabled, allowing you to effortlessly safeguard your website from the most prevalent bots that hamper performance and drain resources.\u003C\u002Fp>\n\u003Ch4>Plugin Features\u003C\u002Fh4>\n\u003Cul>\n\u003Cli>Block unlimited bots from viewing your site\u003C\u002Fli>\n\u003Cli>Easy Export\u002FImport rules (comes with a list of the most common bots)\u003C\u002Fli>\n\u003Cli>Zero Footprint\u003C\u002Fli>\n\u003C\u002Ful>\n","SpiderBlocker will block the most common bots that consume bandwidth and slow down your blog.",20000,612410,80,5,"2024-05-07T13:39:00.000Z","6.5.8","4.0",[75,76,19,20,77],"apache","block","seo","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Fspiderblocker.1.3.7.zip",{"slug":80,"name":81,"version":82,"author":83,"author_profile":84,"description":85,"short_description":86,"active_installs":87,"downloaded":88,"rating":89,"num_ratings":90,"last_updated":91,"tested_up_to":14,"requires_at_least":92,"requires_php":16,"tags":93,"homepage":13,"download_link":97,"security_score":98,"vuln_count":99,"unpatched_count":11,"last_vuln_date":100,"fetched_at":58},"better-robots-txt","Better Robots.txt – AI-Ready Crawl Control & Bot Governance","3.0.0","Pagup","https:\u002F\u002Fprofiles.wordpress.org\u002Fpagup\u002F","\u003Cp>Better Robots.txt replaces the default WordPress robots.txt workflow with a smarter, structured version you can configure and preview before publishing.\u003C\u002Fp>\n\u003Cp>Instead of a blank textarea, you get a guided wizard with presets, plain-language explanations, and a final Review & Save step so you can inspect the generated robots.txt before it goes live.\u003C\u002Fp>\n\u003Cp>Built for beginners and advanced users alike, Better Robots.txt helps you control how search engines, AI crawlers, SEO tools, archive bots, bad bots, social preview bots, and other automated agents interact with your 
site.\u003C\u002Fp>\n\u003Cp>Trusted by thousands of WordPress sites, Better Robots.txt is designed for the AI era without resorting to hype, vague promises, or hidden rules.\u003C\u002Fp>\n\u003Cp>Better Robots.txt is available in Free, Pro, and Premium editions. The free plugin covers the guided workflow and essential crawl control features, while Pro and Premium unlock additional governance, protection, and AI-ready modules. Some screenshots on the plugin page show features from all three editions.\u003C\u002Fp>\n\u003Ch3>A quick overview\u003C\u002Fh3>\n\u003Cp>\u003Ciframe loading=\"lazy\" title=\"Better robots.txt Video — AI-Ready Crawl Control for WordPress\" src=\"https:\u002F\u002Fplayer.vimeo.com\u002Fvideo\u002F1169756981?dnt=1&app_id=122963\" width=\"750\" height=\"372\" frameborder=\"0\" allow=\"autoplay; fullscreen; picture-in-picture; clipboard-write; encrypted-media; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\">\u003C\u002Fiframe>\u003C\u002Fp>\n\u003Ch3>Why Better Robots.txt is different\u003C\u002Fh3>\n\u003Cp>Most robots.txt plugins fall into one of three categories:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Simple text editor\u003C\u002Fli>\n\u003Cli>Virtual robots.txt manager\u003C\u002Fli>\n\u003Cli>Single-purpose AI or policy add-on\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Better Robots.txt goes further.\u003C\u002Fp>\n\u003Cp>It gives you a complete, guided crawl control workflow so you can:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Choose a preset that matches your goals\u003C\u002Fli>\n\u003Cli>Control major crawler categories without writing everything by hand\u003C\u002Fli>\n\u003Cli>Keep core WordPress protection rules visible and editable\u003C\u002Fli>\n\u003Cli>Clean up low-value crawl paths that waste crawl budget\u003C\u002Fli>\n\u003Cli>Generate a cleaner robots.txt output\u003C\u002Fli>\n\u003Cli>Preview the final result before saving\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch3>What you can 
control\u003C\u002Fh3>\n\u003Cp>Better Robots.txt helps you manage:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Search engine visibility\u003C\u002Fli>\n\u003Cli>AI and LLM crawler behavior\u003C\u002Fli>\n\u003Cli>AI usage signals such as search, ai-input, and ai-train preferences\u003C\u002Fli>\n\u003Cli>SEO tool crawlers\u003C\u002Fli>\n\u003Cli>Bad bots and abusive crawlers\u003C\u002Fli>\n\u003Cli>Archive and Wayback access\u003C\u002Fli>\n\u003Cli>Feed crawlers and crawl traps\u003C\u002Fli>\n\u003Cli>WooCommerce crawl cleanup\u003C\u002Fli>\n\u003Cli>CSS, JavaScript, and image crawling rules\u003C\u002Fli>\n\u003Cli>Social media preview crawlers\u003C\u002Fli>\n\u003Cli>ads.txt and app-ads.txt allowance\u003C\u002Fli>\n\u003Cli>llms.txt generation\u003C\u002Fli>\n\u003Cli>Advanced directives such as crawl-delay and custom rules\u003C\u002Fli>\n\u003Cli>Final review before publishing\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch3>Editions\u003C\u002Fh3>\n\u003Cp>Better Robots.txt is available in three editions:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Free – Includes the guided setup, the Essential preset, core crawl control features, and the final Review & Save workflow.\u003C\u002Fli>\n\u003Cli>Pro – Adds more advanced governance and protection modules, including additional AI, crawler, and cleanup controls.\u003C\u002Fli>\n\u003Cli>Premium – Unlocks the most restrictive and advanced protection options, including the Fortress preset and additional high-control modules.\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Some options shown in the interface are marked Free, Pro, or Premium so users can immediately understand which modules belong to each edition.\u003C\u002Fp>\n\u003Ch3>Presets\u003C\u002Fh3>\n\u003Cp>Setup starts with four modes:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Essential – A clean, practical configuration for most websites that want a better robots.txt without complexity.\u003C\u002Fli>\n\u003Cli>AI-First – For publishers and content sites that want AI-ready governance 
without shutting down discovery.\u003C\u002Fli>\n\u003Cli>Fortress – For websites that want stronger protection against scraping, archive capture, and unnecessary crawl activity.\u003C\u002Fli>\n\u003Cli>Custom – For users who prefer to configure each module manually.\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>For many sites, one preset plus a quick review is enough.\u003C\u002Fp>\n\u003Ch3>Built for beginners and experts\u003C\u002Fh3>\n\u003Cp>Beginners get:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>A guided setup instead of a raw robots.txt box\u003C\u002Fli>\n\u003Cli>Preset-based configuration\u003C\u002Fli>\n\u003Cli>Plain-language explanations for important choices\u003C\u002Fli>\n\u003Cli>A safer workflow with a final preview step\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Advanced users get:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Editable core WordPress protection rules\u003C\u002Fli>\n\u003Cli>Fine-grained crawler controls by category\u003C\u002Fli>\n\u003Cli>WooCommerce-oriented cleanup options\u003C\u002Fli>\n\u003Cli>Consolidated output options\u003C\u002Fli>\n\u003Cli>Advanced directives and custom rules\u003C\u002Fli>\n\u003Cli>A final output they can inspect before publishing\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch3>AI-ready, without hype\u003C\u002Fh3>\n\u003Cp>Better Robots.txt includes features for modern AI-related crawl governance, including:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>AI crawler handling\u003C\u002Fli>\n\u003Cli>Optional llms.txt support\u003C\u002Fli>\n\u003Cli>AI usage signals for compliant systems\u003C\u002Fli>\n\u003Cli>Optional machine-readable governance signals for advanced use cases\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>These features help you express how you want automated systems to use your content.\u003C\u002Fp>\n\u003Cp>However, Better Robots.txt does not claim to control AI by force. 
Like robots.txt itself, these signals are most useful with compliant systems and good-faith crawlers.\u003C\u002Fp>\n\u003Ch3>What Better Robots.txt is\u003C\u002Fh3>\n\u003Cp>Better Robots.txt is:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>A robots.txt governance plugin for WordPress\u003C\u002Fli>\n\u003Cli>A guided configuration workflow instead of a raw text editor\u003C\u002Fli>\n\u003Cli>A crawl control layer to reduce wasteful crawling\u003C\u002Fli>\n\u003Cli>A practical bridge between SEO, crawl hygiene, and AI-era policy signaling\u003C\u002Fli>\n\u003Cli>A way to keep your crawl policy clearer for humans and machines\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Technical reference for advanced users: Better Robots.txt also maintains a public \u003Ca href=\"https:\u002F\u002Fgithub.com\u002FGautierDorval\u002Fbetter-robots-txt\" rel=\"nofollow noopener noreferrer ugc\">GitHub repository\u003C\u002Fa> with product definition, governance notes, and machine-readable artefacts.\u003C\u002Fp>\n\u003Ch3>What Better Robots.txt is not\u003C\u002Fh3>\n\u003Cp>Better Robots.txt is not:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>A firewall or Web Application Firewall (WAF)\u003C\u002Fli>\n\u003Cli>An anti-scraping enforcement engine\u003C\u002Fli>\n\u003Cli>A legal compliance engine\u003C\u002Fli>\n\u003Cli>A guarantee that every bot will obey your rules\u003C\u002Fli>\n\u003Cli>A replacement for server-level security or access control\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>It helps you publish a clearer crawl policy.\u003C\u002Fp>\n\u003Cp>It does not replace infrastructure-level protection.\u003C\u002Fp>\n\u003Ch3>Typical use cases\u003C\u002Fh3>\n\u003Cp>Use Better Robots.txt if you want to:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Clean up a weak or noisy default robots.txt\u003C\u002Fli>\n\u003Cli>Reduce crawl waste on WordPress or WooCommerce\u003C\u002Fli>\n\u003Cli>Keep major search engines allowed while restricting other bots\u003C\u002Fli>\n\u003Cli>Control whether archive bots 
can snapshot your site\u003C\u002Fli>\n\u003Cli>Publish AI usage preferences more clearly\u003C\u002Fli>\n\u003Cli>Keep social preview bots allowed while limiting scrapers\u003C\u002Fli>\n\u003Cli>Review the final file before making it live\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch3>Key Features\u003C\u002Fh3>\n\u003Cul>\n\u003Cli>Guided step-by-step wizard\u003C\u002Fli>\n\u003Cli>Preset-based setup: Essential, AI-First, Fortress, Custom\u003C\u002Fli>\n\u003Cli>Search engine visibility controls\u003C\u002Fli>\n\u003Cli>AI and LLM crawler governance\u003C\u002Fli>\n\u003Cli>AI usage signals support\u003C\u002Fli>\n\u003Cli>SEO tool crawler controls\u003C\u002Fli>\n\u003Cli>Bad bot and abusive crawler options\u003C\u002Fli>\n\u003Cli>Archive and Wayback access controls\u003C\u002Fli>\n\u003Cli>Spam, feed, and crawl trap cleanup\u003C\u002Fli>\n\u003Cli>WooCommerce crawl cleanup options\u003C\u002Fli>\n\u003Cli>CSS, JavaScript, and image crawling rules\u003C\u002Fli>\n\u003Cli>Social media preview crawler controls\u003C\u002Fli>\n\u003Cli>ads.txt and app-ads.txt allowance\u003C\u002Fli>\n\u003Cli>Optional llms.txt generation\u003C\u002Fli>\n\u003Cli>Consolidated output option\u003C\u002Fli>\n\u003Cli>Core WordPress protection rules remain visible and editable\u003C\u002Fli>\n\u003Cli>Final Review & Save preview screen\u003C\u002Fli>\n\u003C\u002Ful>\n","Replace the default WordPress robots.txt workflow with a smarter, structured version you can preview before publishing, with Free, Pro, and Premium ed &hellip;",6000,305034,90,102,"2026-03-10T18:33:00.000Z","5.0",[94,95,96,52,77],"ai-crawlers","bot-blocker","llms-txt","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Fbetter-robots-txt.3.0.0.zip",99,2,"2023-02-14 
00:00:00",{"slug":102,"name":103,"version":104,"author":102,"author_profile":105,"description":106,"short_description":107,"active_installs":108,"downloaded":109,"rating":110,"num_ratings":111,"last_updated":112,"tested_up_to":113,"requires_at_least":114,"requires_php":115,"tags":116,"homepage":118,"download_link":119,"security_score":34,"vuln_count":11,"unpatched_count":11,"last_vuln_date":25,"fetched_at":58},"chatbase","Chatbase","1.0.4","https:\u002F\u002Fprofiles.wordpress.org\u002Fchatbase\u002F","\u003Cp>Custom ChatGPT for your website. Build a custom GPT, embed it on your website, and let it handle customer support and lead generation, engage with your users, and more.\u003C\u002Fp>\n\u003Cp>With this plugin, you can embed your \u003Ca href=\"https:\u002F\u002Fwww.chatbase.co\u002F\" rel=\"nofollow ugc\">Chatbase\u003C\u002Fa> chatbot directly into your WordPress site.\u003C\u002Fp>\n\u003Cp>This plugin utilizes Chatbase services to deploy a pre-trained chatbot on your website, leveraging the data you’ve uploaded to the Chatbase application. For more information on \u003Ca href=\"https:\u002F\u002Fwww.chatbase.co\u002Fprivacy\" rel=\"nofollow ugc\">privacy\u003C\u002Fa> and \u003Ca href=\"https:\u002F\u002Fwww.chatbase.co\u002Fterms\" rel=\"nofollow ugc\">terms of service\u003C\u002Fa>, please refer to our dedicated sections on these topics.\u003C\u002Fp>\n","Custom ChatGPT for your website. 
Build a custom GPT, embed it on your website and let it handle customer support, lead generation, engage with your us &hellip;",5000,22666,68,13,"2025-03-03T17:04:00.000Z","6.7.5","4.7","7.0",[18,102,117],"chatbots","https:\u002F\u002Fwww.chatbase.co","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Fchatbase.1.0.4.zip",{"slug":121,"name":122,"version":123,"author":124,"author_profile":125,"description":126,"short_description":127,"active_installs":128,"downloaded":129,"rating":130,"num_ratings":70,"last_updated":131,"tested_up_to":14,"requires_at_least":132,"requires_php":133,"tags":134,"homepage":13,"download_link":137,"security_score":24,"vuln_count":11,"unpatched_count":11,"last_vuln_date":25,"fetched_at":58},"block-ai-crawlers","Block AI Crawlers","1.5.6","lastsplash (a11n)","https:\u002F\u002Fprofiles.wordpress.org\u002Flastsplash\u002F","\u003Cp>Protect Your Content from AI Scraping\u003C\u002Fp>\n\u003Cp>This plugin helps you prevent AI crawlers from using your content as training data for their products. 
By updating your site’s \u003Ccode>robots.txt\u003C\u002Fcode>, it blocks common AI crawlers and scrapers, aiming to protect your content from being used in the training of Large Language Models (LLMs).\u003C\u002Fp>\n\u003Ch3>Features\u003C\u002Fh3>\n\u003Ch3>Blocks AI Crawlers\u003C\u002Fh3>\n\u003Cp>Includes:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>\u003Cstrong>OpenAI\u003C\u002Fstrong> – Blocks crawlers used for ChatGPT\u003C\u002Fli>\n\u003Cli>\u003Cstrong>Google\u003C\u002Fstrong> – Blocks crawlers used by Google’s Gemini AI products\u003C\u002Fli>\n\u003Cli>\u003Cstrong>Facebook \u002F Meta\u003C\u002Fstrong> – Used for Facebook’s AI training\u003C\u002Fli>\n\u003Cli>\u003Cstrong>Anthropic AI\u003C\u002Fstrong> – Blocks crawlers used by Anthropic\u003C\u002Fli>\n\u003Cli>\u003Cstrong>Perplexity\u003C\u002Fstrong> – Blocks crawlers used by Perplexity\u003C\u002Fli>\n\u003Cli>\u003Cstrong>Applebot\u003C\u002Fstrong> – Blocks crawlers used by Apple\u003C\u002Fli>\n\u003Cli>… and more!\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch3>Experimental Meta Tags\u003C\u002Fh3>\n\u003Cp>The plugin adds the “noai, noimageai” directive to your site’s meta tags, instructing AI bots not to use your content in their datasets. Please note that these tags are experimental and have not been standardized.\u003C\u002Fp>\n\u003Ch3>Custom robots.txt Rules\u003C\u002Fh3>\n\u003Cp>Have custom entries for your robots.txt file? You can now add them directly through the plugin!\u003C\u002Fp>\n\u003Ch3>Usage\u003C\u002Fh3>\n\u003Cp>After activation, the plugin will automatically update your \u003Ccode>robots.txt\u003C\u002Fcode> and add the necessary meta tags. 
No further configuration is required, but you can check the settings page for a full list of blocked crawlers.\u003C\u002Fp>\n\u003Ch3>Limitations\u003C\u002Fh3>\n\u003Cp>While this plugin aims to block specified crawlers, it cannot guarantee complete protection against all forms of scraping, as some bots may disregard \u003Ccode>robots.txt\u003C\u002Fcode> directives.\u003C\u002Fp>\n\u003Ch3>Support\u003C\u002Fh3>\n\u003Cp>For questions or support, \u003Ca href=\"https:\u002F\u002Fwordpress.org\u002Fsupport\u002Fplugin\u002Fblock-ai-crawlers\u002F\" rel=\"ugc\">please post on the forums\u003C\u002Fa> or \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fbobmatyas\u002Fwp-block-ai-crawlers\u002Fissues\" rel=\"nofollow ugc\">on GitHub\u003C\u002Fa>.\u003C\u002Fp>\n","Tell AI (Artificial Intelligence) companies not to scrape your site for their AI products.",1000,13412,96,"2026-02-15T13:47:00.000Z","6.8","8.2",[18,135,136,52],"chatgpt","crawlers","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Fblock-ai-crawlers.1.5.6.zip",{"attackSurface":139,"codeSignals":159,"taintFlows":277,"riskAssessment":278,"analyzedAt":286},{"hooks":140,"ajaxHandlers":155,"restRoutes":156,"shortcodes":157,"cronEvents":158,"entryPointCount":11,"unprotectedCount":11},[141,147,150,152],{"type":142,"name":143,"callback":144,"file":145,"line":146},"action","admin_notices","closure","scraperguard.php",49,{"type":142,"name":148,"callback":144,"file":145,"line":149},"admin_init",62,{"type":142,"name":151,"callback":144,"file":145,"line":24},"admin_menu",{"type":142,"name":153,"callback":144,"priority":11,"file":145,"line":154},"plugins_loaded",108,[],[],[],[],{"dangerousFunctions":160,"sqlUsage":161,"outputEscaping":163,"fileOperations":11,"externalRequests":275,"nonceChecks":275,"capabilityChecks":99,"bundledLibraries":276},[],{"prepared":11,"raw":11,"locations":162},[],{"escaped":164,"rawEcho":165,"locations":166},60,53,[167,171,173,175,177,179,181,183,185,187,189,191,193,195,197,199,201,2
03,205,207,209,211,213,215,217,219,221,223,225,227,229,231,233,235,237,239,241,243,245,247,249,251,253,255,257,259,261,263,265,267,269,271,273],{"file":168,"line":169,"context":170},"includes\\class-asb-admin.php",126,"raw output",{"file":168,"line":172,"context":170},136,{"file":168,"line":174,"context":170},138,{"file":168,"line":176,"context":170},139,{"file":168,"line":178,"context":170},140,{"file":168,"line":180,"context":170},141,{"file":168,"line":182,"context":170},142,{"file":168,"line":184,"context":170},151,{"file":168,"line":186,"context":170},156,{"file":168,"line":188,"context":170},158,{"file":168,"line":190,"context":170},162,{"file":168,"line":192,"context":170},206,{"file":168,"line":194,"context":170},208,{"file":168,"line":196,"context":170},211,{"file":168,"line":198,"context":170},216,{"file":168,"line":200,"context":170},218,{"file":168,"line":202,"context":170},224,{"file":168,"line":204,"context":170},229,{"file":168,"line":206,"context":170},231,{"file":168,"line":208,"context":170},232,{"file":168,"line":210,"context":170},237,{"file":168,"line":212,"context":170},242,{"file":168,"line":214,"context":170},246,{"file":168,"line":216,"context":170},248,{"file":168,"line":218,"context":170},253,{"file":168,"line":220,"context":170},257,{"file":168,"line":222,"context":170},259,{"file":168,"line":224,"context":170},260,{"file":168,"line":226,"context":170},261,{"file":168,"line":228,"context":170},265,{"file":168,"line":230,"context":170},267,{"file":168,"line":232,"context":170},268,{"file":168,"line":234,"context":170},270,{"file":168,"line":236,"context":170},273,{"file":168,"line":238,"context":170},279,{"file":168,"line":240,"context":170},283,{"file":168,"line":242,"context":170},285,{"file":168,"line":244,"context":170},288,{"file":168,"line":246,"context":170},293,{"file":168,"line":248,"context":170},297,{"file":168,"line":250,"context":170},299,{"file":168,"line":252,"context":170},304,{"file":168,"line":254,"context":170},305,{"fil
e":168,"line":256,"context":170},306,{"file":168,"line":258,"context":170},321,{"file":168,"line":260,"context":170},322,{"file":168,"line":262,"context":170},329,{"file":168,"line":264,"context":170},345,{"file":168,"line":266,"context":170},349,{"file":168,"line":268,"context":170},350,{"file":168,"line":270,"context":170},367,{"file":145,"line":272,"context":170},54,{"file":145,"line":274,"context":170},55,1,[],[],{"summary":279,"deductions":280},"The scraperguard plugin v1.0.0 exhibits a generally positive security posture, with no recorded vulnerabilities (CVEs) and a small apparent attack surface. The absence of detected dangerous functions, raw SQL queries, file operations, and critical taint flows is commendable. However, output escaping is a significant concern: with only 53% of outputs properly escaped, there is a substantial risk of Cross-Site Scripting (XSS) vulnerabilities. While the plugin includes nonce and capability checks, the lack of consistent escaping across all output points leaves it susceptible to attacks that could lead to code execution or data theft within the WordPress environment. 
The plugin's single external HTTP request should also be monitored, as the target of this request could potentially be compromised or malicious.",[281,284],{"reason":282,"points":283},"Output escaping only 53% proper",8,{"reason":285,"points":99},"Single external HTTP request","2026-03-17T05:53:40.704Z",{"wat":288,"direct":297},{"assetPaths":289,"generatorPatterns":292,"scriptPaths":293,"versionParams":294},[290,291],"\u002Fwp-content\u002Fplugins\u002Fscraperguard\u002Fassets\u002Fcss\u002Fadmin.css","\u002Fwp-content\u002Fplugins\u002Fscraperguard\u002Fassets\u002Fjs\u002Fadmin.js",[],[],[295,296],"scraperguard\u002Fassets\u002Fcss\u002Fadmin.css?ver=","scraperguard\u002Fassets\u002Fjs\u002Fadmin.js?ver=",{"cssClasses":298,"htmlComments":300,"htmlAttributes":302,"restEndpoints":303,"jsGlobals":304,"shortcodeOutput":306},[299],"scraperguard-options",[301],"\u003C!-- ScraperGuard admin page -->",[],[],[305],"eigen_asb_options_obj",[]]