[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"$fANLEZZijBt_8peFsSTbp98hg3zNTQWxIx9eVFR98TxI":3},{"slug":4,"name":5,"version":6,"author":7,"author_profile":8,"description":9,"short_description":10,"active_installs":11,"downloaded":12,"rating":13,"num_ratings":13,"last_updated":14,"tested_up_to":15,"requires_at_least":16,"requires_php":17,"tags":18,"homepage":24,"download_link":25,"security_score":26,"vuln_count":13,"unpatched_count":13,"last_vuln_date":27,"fetched_at":28,"vulnerabilities":29,"developer":30,"crawl_stats":27,"alternatives":37,"analysis":132,"fingerprints":166},"search-engines-blocked-warning","Search engines blocked warning","1.0.0","apasionados","https:\u002F\u002Fprofiles.wordpress.org\u002Fapasionados\u002F","\u003Cp>The plugin shows a warning in the WordPress administration header when the option “Search Engine Visibility: Discourage search engines from indexing this site” is enabled.\u003C\u002Fp>\n\u003Cp>\u003Cstrong>It’s very important when moving a website from staging or development to production to remember to switch this option off. 
Or even better, never switch it on.\u003C\u002Fstrong>\u003C\u002Fp>\n\u003Cblockquote>\n\u003Cp>SETTINGS \u002F READING “Search Engine Visibility: Discourage search engines from indexing this site”: Checking this box tells search engines to completely avoid inspecting \u002F indexing the site’s contents, meaning that the site will not show up in search results.\u003C\u002Fp>\n\u003C\u002Fblockquote>\n\u003Ch4>What can I do with this plugin?\u003C\u002Fh4>\n\u003Cp>This plugin shows a WARNING in the administration header of the website when “Search Engine Visibility: Discourage search engines from indexing this site” is enabled.\u003C\u002Fp>\n\u003Ch4>What ideas is this plugin based on?\u003C\u002Fh4>\n\u003Cp>This plugin is based on the idea of the \u003Ca href=\"https:\u002F\u002Fwordpress.org\u002Fplugins\u002Fdiscourage-search-engines-notifier\u002F\" rel=\"ugc\">“Discourage Search Engines Notifier”\u003C\u002Fa> plugin.\u003C\u002Fp>\n\u003Ch4>System requirements\u003C\u002Fh4>\n\u003Cp>PHP version 5.6 or greater.\u003C\u002Fp>\n\u003Ch4>Search engines blocked warning Plugin in your Language!\u003C\u002Fh4>\n\u003Cp>This first release is available in English and Spanish. 
In the “languages” folder we have included the necessary files to translate this plugin.\u003C\u002Fp>\n\u003Cp>If you would like the plugin in your language and you’re good at translating, please drop us a line at \u003Ca href=\"https:\u002F\u002Fapasionados.es\u002Fcontacto\u002Findex.php?desde=wordpress-org-apa-search-engine-blocked-warning-home\" rel=\"nofollow ugc\">Contact us\u003C\u002Fa>.\u003C\u002Fp>\n\u003Ch4>Further Reading\u003C\u002Fh4>\n\u003Cp>You can access the description of the plugin in Spanish at: \u003Ca href=\"https:\u002F\u002Fapasionados.es\u002Fblog\u002Faviso-de-bloqueo-de-buscadores-wordpress-plugin-7749\u002F\" rel=\"nofollow ugc\">Aviso de bloqueo de buscadores | WordPress Plugin\u003C\u002Fa>.\u003C\u002Fp>\n\u003Ch3>Contact\u003C\u002Fh3>\n\u003Cp>For further information please send us an \u003Ca href=\"https:\u002F\u002Fapasionados.es\u002Fcontacto\u002Findex.php?desde=wordpress-org-apa-search-engine-blocked-warning\" rel=\"nofollow ugc\">email\u003C\u002Fa>.\u003C\u002Fp>\n","Shows a warning in the WordPress administration header when the option \"Search Engine Visibility: Discourage search engines from indexing this si 
&hellip;",500,2885,0,"2026-01-27T14:11:00.000Z","6.9.4","4.0.1","5.6",[19,20,21,22,23],"block","discourage-search-engines","robots","robots-txt","search-engine-visibility","","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Fsearch-engines-blocked-warning.1.0.0.zip",100,null,"2026-03-15T15:16:48.613Z",[],{"slug":7,"display_name":7,"profile_url":8,"plugin_count":31,"total_installs":32,"avg_security_score":33,"avg_patch_time_days":34,"trust_score":35,"computed_at":36},28,60790,94,326,75,"2026-04-04T02:46:28.202Z",[38,62,81,97,114],{"slug":39,"name":40,"version":41,"author":42,"author_profile":43,"description":44,"short_description":45,"active_installs":46,"downloaded":47,"rating":48,"num_ratings":49,"last_updated":50,"tested_up_to":15,"requires_at_least":51,"requires_php":52,"tags":53,"homepage":24,"download_link":58,"security_score":59,"vuln_count":60,"unpatched_count":13,"last_vuln_date":61,"fetched_at":28},"better-robots-txt","Better Robots.txt – AI-Ready Crawl Control & Bot Governance","3.0.0","Pagup","https:\u002F\u002Fprofiles.wordpress.org\u002Fpagup\u002F","\u003Cp>Better Robots.txt replaces the default WordPress robots.txt workflow with a smarter, structured version you can configure and preview before publishing.\u003C\u002Fp>\n\u003Cp>Instead of a blank textarea, you get a guided wizard with presets, plain-language explanations, and a final Review & Save step so you can inspect the generated robots.txt before it goes live.\u003C\u002Fp>\n\u003Cp>Built for beginners and advanced users alike, Better Robots.txt helps you control how search engines, AI crawlers, SEO tools, archive bots, bad bots, social preview bots, and other automated agents interact with your site.\u003C\u002Fp>\n\u003Cp>Trusted by thousands of WordPress sites, Better Robots.txt is designed for the AI era without resorting to hype, vague promises, or hidden rules.\u003C\u002Fp>\n\u003Cp>Better Robots.txt is available in Free, Pro, and Premium editions. 
The free plugin covers the guided workflow and essential crawl control features, while Pro and Premium unlock additional governance, protection, and AI-ready modules. Some screenshots on the plugin page show features from all three editions.\u003C\u002Fp>\n\u003Ch3>A quick overview\u003C\u002Fh3>\n\u003Cp>\u003Ciframe loading=\"lazy\" title=\"Better robots.txt Video — AI-Ready Crawl Control for WordPress\" src=\"https:\u002F\u002Fplayer.vimeo.com\u002Fvideo\u002F1169756981?dnt=1&app_id=122963\" width=\"750\" height=\"372\" frameborder=\"0\" allow=\"autoplay; fullscreen; picture-in-picture; clipboard-write; encrypted-media; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\">\u003C\u002Fiframe>\u003C\u002Fp>\n\u003Ch3>Why Better Robots.txt is different\u003C\u002Fh3>\n\u003Cp>Most robots.txt plugins fall into one of three categories:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Simple text editor\u003C\u002Fli>\n\u003Cli>Virtual robots.txt manager\u003C\u002Fli>\n\u003Cli>Single-purpose AI or policy add-on\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Better Robots.txt goes further.\u003C\u002Fp>\n\u003Cp>It gives you a complete, guided crawl control workflow so you can:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Choose a preset that matches your goals\u003C\u002Fli>\n\u003Cli>Control major crawler categories without writing everything by hand\u003C\u002Fli>\n\u003Cli>Keep core WordPress protection rules visible and editable\u003C\u002Fli>\n\u003Cli>Clean up low-value crawl paths that waste crawl budget\u003C\u002Fli>\n\u003Cli>Generate a cleaner robots.txt output\u003C\u002Fli>\n\u003Cli>Preview the final result before saving\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch3>What you can control\u003C\u002Fh3>\n\u003Cp>Better Robots.txt helps you manage:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Search engine visibility\u003C\u002Fli>\n\u003Cli>AI and LLM crawler behavior\u003C\u002Fli>\n\u003Cli>AI usage signals such as search, ai-input, and ai-train preferences\u003C\u002Fli>\n\u003Cli>SEO 
tool crawlers\u003C\u002Fli>\n\u003Cli>Bad bots and abusive crawlers\u003C\u002Fli>\n\u003Cli>Archive and Wayback access\u003C\u002Fli>\n\u003Cli>Feed crawlers and crawl traps\u003C\u002Fli>\n\u003Cli>WooCommerce crawl cleanup\u003C\u002Fli>\n\u003Cli>CSS, JavaScript, and image crawling rules\u003C\u002Fli>\n\u003Cli>Social media preview crawlers\u003C\u002Fli>\n\u003Cli>ads.txt and app-ads.txt allowance\u003C\u002Fli>\n\u003Cli>llms.txt generation\u003C\u002Fli>\n\u003Cli>Advanced directives such as crawl-delay and custom rules\u003C\u002Fli>\n\u003Cli>Final review before publishing\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch3>Editions\u003C\u002Fh3>\n\u003Cp>Better Robots.txt is available in three editions:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Free – Includes the guided setup, the Essential preset, core crawl control features, and the final Review & Save workflow.\u003C\u002Fli>\n\u003Cli>Pro – Adds more advanced governance and protection modules, including additional AI, crawler, and cleanup controls.\u003C\u002Fli>\n\u003Cli>Premium – Unlocks the most restrictive and advanced protection options, including the Fortress preset and additional high-control modules.\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Some options shown in the interface are marked Free, Pro, or Premium so users can immediately understand which modules belong to each edition.\u003C\u002Fp>\n\u003Ch3>Presets\u003C\u002Fh3>\n\u003Cp>Setup starts with four modes:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Essential – A clean, practical configuration for most websites that want a better robots.txt without complexity.\u003C\u002Fli>\n\u003Cli>AI-First – For publishers and content sites that want AI-ready governance without shutting down discovery.\u003C\u002Fli>\n\u003Cli>Fortress – For websites that want stronger protection against scraping, archive capture, and unnecessary crawl activity.\u003C\u002Fli>\n\u003Cli>Custom – For users who prefer to configure each module 
manually.\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>For many sites, one preset plus a quick review is enough.\u003C\u002Fp>\n\u003Ch3>Built for beginners and experts\u003C\u002Fh3>\n\u003Cp>Beginners get:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>A guided setup instead of a raw robots.txt box\u003C\u002Fli>\n\u003Cli>Preset-based configuration\u003C\u002Fli>\n\u003Cli>Plain-language explanations for important choices\u003C\u002Fli>\n\u003Cli>A safer workflow with a final preview step\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Advanced users get:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Editable core WordPress protection rules\u003C\u002Fli>\n\u003Cli>Fine-grained crawler controls by category\u003C\u002Fli>\n\u003Cli>WooCommerce-oriented cleanup options\u003C\u002Fli>\n\u003Cli>Consolidated output options\u003C\u002Fli>\n\u003Cli>Advanced directives and custom rules\u003C\u002Fli>\n\u003Cli>A final output they can inspect before publishing\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch3>AI-ready, without hype\u003C\u002Fh3>\n\u003Cp>Better Robots.txt includes features for modern AI-related crawl governance, including:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>AI crawler handling\u003C\u002Fli>\n\u003Cli>Optional llms.txt support\u003C\u002Fli>\n\u003Cli>AI usage signals for compliant systems\u003C\u002Fli>\n\u003Cli>Optional machine-readable governance signals for advanced use cases\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>These features help you express how you want automated systems to use your content.\u003C\u002Fp>\n\u003Cp>However, Better Robots.txt does not claim to control AI by force. 
Like robots.txt itself, these signals are most useful with compliant systems and good-faith crawlers.\u003C\u002Fp>\n\u003Ch3>What Better Robots.txt is\u003C\u002Fh3>\n\u003Cp>Better Robots.txt is:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>A robots.txt governance plugin for WordPress\u003C\u002Fli>\n\u003Cli>A guided configuration workflow instead of a raw text editor\u003C\u002Fli>\n\u003Cli>A crawl control layer to reduce wasteful crawling\u003C\u002Fli>\n\u003Cli>A practical bridge between SEO, crawl hygiene, and AI-era policy signaling\u003C\u002Fli>\n\u003Cli>A way to keep your crawl policy clearer for humans and machines\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Technical reference for advanced users: Better Robots.txt also maintains a public \u003Ca href=\"https:\u002F\u002Fgithub.com\u002FGautierDorval\u002Fbetter-robots-txt\" rel=\"nofollow noopener noreferrer ugc\">GitHub repository\u003C\u002Fa> with product definition, governance notes, and machine-readable artefacts.\u003C\u002Fp>\n\u003Ch3>What Better Robots.txt is not\u003C\u002Fh3>\n\u003Cp>Better Robots.txt is not:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>A firewall or Web Application Firewall (WAF)\u003C\u002Fli>\n\u003Cli>An anti-scraping enforcement engine\u003C\u002Fli>\n\u003Cli>A legal compliance engine\u003C\u002Fli>\n\u003Cli>A guarantee that every bot will obey your rules\u003C\u002Fli>\n\u003Cli>A replacement for server-level security or access control\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>It helps you publish a clearer crawl policy.\u003C\u002Fp>\n\u003Cp>It does not replace infrastructure-level protection.\u003C\u002Fp>\n\u003Ch3>Typical use cases\u003C\u002Fh3>\n\u003Cp>Use Better Robots.txt if you want to:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Clean up a weak or noisy default robots.txt\u003C\u002Fli>\n\u003Cli>Reduce crawl waste on WordPress or WooCommerce\u003C\u002Fli>\n\u003Cli>Keep major search engines allowed while restricting other bots\u003C\u002Fli>\n\u003Cli>Control whether archive bots 
can snapshot your site\u003C\u002Fli>\n\u003Cli>Publish AI usage preferences more clearly\u003C\u002Fli>\n\u003Cli>Keep social preview bots allowed while limiting scrapers\u003C\u002Fli>\n\u003Cli>Review the final file before making it live\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch3>Key Features\u003C\u002Fh3>\n\u003Cul>\n\u003Cli>Guided step-by-step wizard\u003C\u002Fli>\n\u003Cli>Preset-based setup: Essential, AI-First, Fortress, Custom\u003C\u002Fli>\n\u003Cli>Search engine visibility controls\u003C\u002Fli>\n\u003Cli>AI and LLM crawler governance\u003C\u002Fli>\n\u003Cli>AI usage signals support\u003C\u002Fli>\n\u003Cli>SEO tool crawler controls\u003C\u002Fli>\n\u003Cli>Bad bot and abusive crawler options\u003C\u002Fli>\n\u003Cli>Archive and Wayback access controls\u003C\u002Fli>\n\u003Cli>Spam, feed, and crawl trap cleanup\u003C\u002Fli>\n\u003Cli>WooCommerce crawl cleanup options\u003C\u002Fli>\n\u003Cli>CSS, JavaScript, and image crawling rules\u003C\u002Fli>\n\u003Cli>Social media preview crawler controls\u003C\u002Fli>\n\u003Cli>ads.txt and app-ads.txt allowance\u003C\u002Fli>\n\u003Cli>Optional llms.txt generation\u003C\u002Fli>\n\u003Cli>Consolidated output option\u003C\u002Fli>\n\u003Cli>Core WordPress protection rules remain visible and editable\u003C\u002Fli>\n\u003Cli>Final Review & Save preview screen\u003C\u002Fli>\n\u003C\u002Ful>\n","Replace the default WordPress robots.txt workflow with a smarter, structured version you can preview before publishing, with Free, Pro, and Premium ed &hellip;",6000,305034,90,102,"2026-03-10T18:33:00.000Z","5.0","7.4",[54,55,56,22,57],"ai-crawlers","bot-blocker","llms-txt","seo","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Fbetter-robots-txt.3.0.0.zip",99,2,"2023-02-14 
00:00:00",{"slug":63,"name":64,"version":65,"author":66,"author_profile":67,"description":68,"short_description":69,"active_installs":13,"downloaded":70,"rating":13,"num_ratings":13,"last_updated":71,"tested_up_to":15,"requires_at_least":72,"requires_php":73,"tags":74,"homepage":79,"download_link":80,"security_score":26,"vuln_count":13,"unpatched_count":13,"last_vuln_date":27,"fetched_at":28},"wimb-and-block","Block old browser versions and suspicious browsers","1.4","hupe13","https:\u002F\u002Fprofiles.wordpress.org\u002Fhupe13\u002F","\u003Cp>Every time your web browser makes a request to a website, it sends an HTTP header called the “User Agent”. The User Agent string contains information about your web browser name, operating system, device type and lots of other useful bits of information.\u003C\u002Fp>\n\u003Cp>The plugin uses an API to send the User Agent string of every browser that accesses your website for the first time to \u003Ca href=\"https:\u002F\u002Fapi.whatismybrowser.com\u002Fapi\u002Fv2\u002Fuser_agent_parse\" rel=\"nofollow ugc\">https:\u002F\u002Fapi.whatismybrowser.com\u002Fapi\u002Fv2\u002Fuser_agent_parse\u003C\u002Fa> to obtain the following information about the User Agent:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Software Name & Version\u003C\u002Fli>\n\u003Cli>Operating System Name & Version\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>\u003Ca href=\"https:\u002F\u002Fdevelopers.whatismybrowser.com\u002Fapi\u002Fabout\u002Flegal\u002F\" rel=\"nofollow ugc\">WhatIsMyBrowser.com API Terms and Conditions\u003C\u002Fa>\u003C\u002Fp>\n\u003Cp>With this information, the plugin attempts to detect old and bad browsers and denies them access to your website.\u003C\u002Fp>\n\u003Ch4>HowTo\u003C\u002Fh4>\n\u003Cul>\n\u003Cli>Go to \u003Ca href=\"https:\u002F\u002Fdevelopers.whatismybrowser.com\u002Fapi\u002Fpricing\u002F\" rel=\"nofollow ugc\">What is my browser?\u003C\u002Fa> and sign up to the WhatIsMyBrowser.com API for a Basic (free) Application 
Plan.\u003C\u002Fli>\n\u003Cli>You have a limit of 5000 hits \u002F month for Parsing User Agent. That’s why the plugin manages a database table.\u003C\u002Fli>\n\u003Cli>The user agent string of every browser that accesses your website for the first time is sent to this service, and the information is stored in a table.\u003C\u002Fli>\n\u003Cli>Browsers are blocked if the browser and\u002For system are outdated:\n\u003Cul>\n\u003Cli>Default: Chrome, Edge and Chrome based browsers \u003C 139, Firefox browsers \u003C 140, Safari \u003C 18, Samsung Browser \u003C 28, Internet Explorer, Netscape (!)\u003C\u002Fli>\n\u003Cli>Old systems are all Windows versions prior to Windows 10, MacOS prior to Catalina and Android versions \u003C 10 and Symbian.\u003C\u002Fli>\n\u003C\u002Ful>\n\u003C\u002Fli>\n\u003Cli>A browser will also be blocked if “Software” contains “unknown” or is empty.\u003C\u002Fli>\n\u003Cli>You can also set up other browsers.\u003C\u002Fli>\n\u003Cli>Sometimes there are false positives, for example, if the browser is from Mastodon. In this case, you can exclude it from the check.\u003C\u002Fli>\n\u003Cli>The plugin checks whether the crawlers really originate from Google, Bing, Yandex, Apple, Mojeek, Baidu, Seznam.\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch4>About robots.txt\u003C\u002Fh4>\n\u003Cul>\n\u003Cli>You can configure some rewrite rules to provide a robots.txt file that can allow or deny crawling for a browser. If crawling is denied, access to your website will be blocked for that browser.\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch4>Logging\u003C\u002Fh4>\n\u003Cul>\n\u003Cli>The logging can be very detailed. Please check the logs and the WIMB table regularly.\u003C\u002Fli>\n\u003C\u002Ful>\n","With the help of WhatIsMyBrowser the plugin detects old and bad browsers and denies them access. 
A special robots.txt denies crawling by bad bots.",363,"2026-02-23T19:43:00.000Z","6.2","8.1",[75,76,77,22,78],"bad-bots","ban","blocking","security","https:\u002F\u002Fleafext.de\u002Fhp\u002Fwimb\u002F","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Fwimb-and-block.1.4.zip",{"slug":82,"name":83,"version":84,"author":85,"author_profile":86,"description":87,"short_description":88,"active_installs":89,"downloaded":90,"rating":91,"num_ratings":92,"last_updated":93,"tested_up_to":15,"requires_at_least":51,"requires_php":24,"tags":94,"homepage":95,"download_link":96,"security_score":26,"vuln_count":13,"unpatched_count":13,"last_vuln_date":27,"fetched_at":28},"pc-robotstxt","Virtual Robots.txt","1.10","Marios Alexandrou","https:\u002F\u002Fprofiles.wordpress.org\u002Fmarios-alexandrou\u002F","\u003Cp>Virtual Robots.txt is an easy (i.e. automated) solution to creating and managing a robots.txt file for your site. Instead of mucking about with FTP, files, permissions, etc., just upload and activate the plugin and you’re done.\u003C\u002Fp>\n\u003Cp>By default, the Virtual Robots.txt plugin allows access to the parts of WordPress that good bots like Google need to access. Other parts are blocked.\u003C\u002Fp>\n\u003Cp>If the plugin detects an existing XML sitemap file, a reference to it will be automatically added to your robots.txt file.\u003C\u002Fp>\n","Virtual Robots.txt automatically creates a robots.txt file for your site. 
Your robots.txt file can be easily edited from the plugin settings page.",50000,441168,86,10,"2025-12-29T14:20:00.000Z",[21,22],"http:\u002F\u002Finfolific.com\u002Ftechnology\u002Fsoftware-worth-using\u002Frobots-txt-plugin-for-wordpress","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Fpc-robotstxt.zip",{"slug":98,"name":99,"version":100,"author":101,"author_profile":102,"description":103,"short_description":104,"active_installs":89,"downloaded":105,"rating":26,"num_ratings":106,"last_updated":107,"tested_up_to":108,"requires_at_least":109,"requires_php":110,"tags":111,"homepage":112,"download_link":113,"security_score":26,"vuln_count":13,"unpatched_count":13,"last_vuln_date":27,"fetched_at":28},"wp-robots-txt","WP Robots Txt","1.3.5","George Pattichis","https:\u002F\u002Fprofiles.wordpress.org\u002Fpattihis\u002F","\u003Cp>WordPress, by default, includes a simple robots.txt file that’s dynamically generated from within the WP application. This is great, but how do you easily change the content?\u003C\u002Fp>\n\u003Cp>Enter \u003Cstrong>WP Robots Txt\u003C\u002Fstrong>, a plugin that adds an additional field to the “Reading” admin page where you can do just that. 
No manual coding or file editing required!\u003C\u002Fp>\n\u003Cp>Simply visit https:\u002F\u002Fyour-site.com\u002Fwp-admin\u002Foptions-reading.php and you can control the contents of your https:\u002F\u002Fyour-site.com\u002Frobots.txt\u003C\u002Fp>\n\u003Cp>\u003Ca href=\"https:\u002F\u002Fwordpress.org\u002Fplugins\u002Fwp-robots-txt\u002F#developers\" rel=\"ugc\">Changelog\u003C\u002Fa>\u003C\u002Fp>\n","WP Robots Txt allows you to edit the content of your robots.txt file.",545169,21,"2025-06-29T19:37:00.000Z","6.8.5","5.3.0","7.0",[21,22,57],"https:\u002F\u002Fgithub.com\u002Fpattihis\u002Fwp-robots.txt","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Fwp-robots-txt.1.3.5.zip",{"slug":115,"name":116,"version":117,"author":118,"author_profile":119,"description":120,"short_description":121,"active_installs":122,"downloaded":123,"rating":48,"num_ratings":124,"last_updated":125,"tested_up_to":126,"requires_at_least":127,"requires_php":17,"tags":128,"homepage":24,"download_link":130,"security_score":131,"vuln_count":13,"unpatched_count":13,"last_vuln_date":27,"fetched_at":28},"robots-txt-editor","Robots.txt Editor","1.1.4","Processby","https:\u002F\u002Fprofiles.wordpress.org\u002Fprocessby\u002F","\u003Cp>The plugin allows you to create and edit the robots.txt file on your site.\u003C\u002Fp>\n\u003Ch4>Features\u003C\u002Fh4>\n\u003Cul>\n\u003Cli>Works with multisite network on Subdomains;\u003C\u002Fli>\n\u003Cli>An example of the correct file for WordPress;\u003C\u002Fli>\n\u003Cli>Works out of the box;\u003C\u002Fli>\n\u003Cli>Totally Free.\u003C\u002Fli>\n\u003C\u002Ful>\n","Robots.txt for 
WordPress",10000,111434,8,"2021-01-16T00:07:00.000Z","5.6.17","4.0",[129,21,22,57],"crawler","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Frobots-txt-editor.zip",85,{"attackSurface":133,"codeSignals":154,"taintFlows":161,"riskAssessment":162,"analyzedAt":165},{"hooks":134,"ajaxHandlers":150,"restRoutes":151,"shortcodes":152,"cronEvents":153,"entryPointCount":13,"unprotectedCount":13},[135,142,146],{"type":136,"name":137,"callback":138,"priority":139,"file":140,"line":141},"action","admin_bar_menu","apa_search_engine_blocked_warning",9999,"search-engine-blocked-warning.php",30,{"type":136,"name":143,"callback":144,"file":140,"line":145},"admin_head","apa_search_engine_blocked_warning_css",40,{"type":136,"name":147,"callback":148,"file":140,"line":149},"plugins_loaded","apa_search_engine_blocked_warning_f_init",63,[],[],[],[],{"dangerousFunctions":155,"sqlUsage":156,"outputEscaping":158,"fileOperations":13,"externalRequests":13,"nonceChecks":13,"capabilityChecks":13,"bundledLibraries":160},[],{"prepared":13,"raw":13,"locations":157},[],{"escaped":13,"rawEcho":13,"locations":159},[],[],[],{"summary":163,"deductions":164},"The \"search-engines-blocked-warning\" plugin, version 1.0.0, exhibits a very strong security posture based on the provided static analysis.  The absence of any identified dangerous functions, raw SQL queries, unsanitized output, file operations, or external HTTP requests is highly commendable. Furthermore, the lack of any detected taint flows, especially those with unsanitized paths or critical\u002Fhigh severity, suggests robust code hygiene.  The plugin also demonstrates a secure approach by not directly exposing entry points like AJAX handlers, REST API routes, or shortcodes without proper authorization mechanisms, indicated by zero unprotected entry points.\n\nThe vulnerability history is equally impressive, showing no recorded CVEs at any severity level. This suggests a history of stable and secure development for this plugin.  
The complete absence of common vulnerability types and recent issues further reinforces this positive trend.  While the current data points to an exceptionally secure plugin, the complete absence of nonce and capability checks across all potential entry points (even though there are none identified) could be a theoretical area of concern if the attack surface were to expand in future versions.  However, based solely on the current analysis, the plugin appears to be exceptionally well-secured.",[],"2026-03-16T19:39:07.597Z",{"wat":167,"direct":172},{"assetPaths":168,"generatorPatterns":169,"scriptPaths":170,"versionParams":171},[],[],[],[],{"cssClasses":173,"htmlComments":175,"htmlAttributes":176,"restEndpoints":177,"jsGlobals":178,"shortcodeOutput":179},[174],"dashicons-hidden",[],[],[],[],[]]