[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"$fqQ9JVoRWbGIN4jisx7sOTzcXitrRrhLI2NJR_0nnhlM":3},{"slug":4,"name":5,"version":6,"author":7,"author_profile":8,"description":9,"short_description":10,"active_installs":11,"downloaded":12,"rating":11,"num_ratings":13,"last_updated":14,"tested_up_to":15,"requires_at_least":16,"requires_php":17,"tags":18,"homepage":24,"download_link":25,"security_score":26,"vuln_count":27,"unpatched_count":27,"last_vuln_date":28,"fetched_at":29,"vulnerabilities":30,"developer":31,"crawl_stats":28,"alternatives":36,"analysis":144,"fingerprints":250},"meta-robots-by-seo-sign","MetaRobots by SEO-Sign","1.0.0","Artem Pilipets","https:\u002F\u002Fprofiles.wordpress.org\u002Fartem-pilipets\u002F","\u003Cp>Manage meta robots restrictions and the robots.txt file from the control panel using commands in robots.txt format.\u003C\u002Fp>\n\u003Cp>A PHP script parses commands from the metarobots.txt file and creates the relevant tags.\u003C\u002Fp>\n\u003Ch4>The script currently supports the following commands:\u003C\u002Fh4>\n\u003Cul>\n\u003Cli>\n\u003Cp>Disallow:\u003Cbr \u002F>\nname=”robots” content=”noindex, nofollow”\u003Cbr \u002F>\nBlocks the content from being indexed by search engines.\u003C\u002Fp>\n\u003C\u002Fli>\n\u003Cli>\n\u003Cp>Index:\u003Cbr \u002F>\nname=”robots” content=”index, nofollow”\u003Cbr \u002F>\nAllows indexing, forbids following the links.\u003C\u002Fp>\n\u003C\u002Fli>\n\u003Cli>\n\u003Cp>Follow:\u003Cbr \u002F>\nname=”robots” content=”noindex, follow”\u003Cbr \u002F>\nForbids indexing, allows following the links.\u003C\u002Fp>\n\u003C\u002Fli>\n\u003Cli>\n\u003Cp>Noarchiv:\u003Cbr \u002F>\nname=”robots” content=”noarchive”\u003Cbr \u002F>\nDo not show a link to the cached copy.\u003C\u002Fp>\n\u003C\u002Fli>\n\u003Cli>\n\u003Cp>Nosnippet:\u003Cbr \u002F>\nname=”robots” content=”nosnippet”\u003Cbr \u002F>\nDo not create a snippet.\u003C\u002Fp>\n\u003C\u002Fli>\n\u003Cli>\n\u003Cp>Noodp:\u003Cbr \u002F>\nname=”robots” 
content=”noodp”\u003Cbr \u002F>\nDo not use the description from DMOZ for the snippet.\u003C\u002Fp>\n\u003C\u002Fli>\n\u003Cli>\n\u003Cp>Notranslate:\u003Cbr \u002F>\nname=”robots” content=”notranslate”\u003Cbr \u002F>\nDo not offer page translation.\u003C\u002Fp>\n\u003C\u002Fli>\n\u003Cli>\n\u003Cp>Noimageindex:\u003Cbr \u002F>\nname=”robots” content=”noimageindex”\u003Cbr \u002F>\nDo not index the images on the page.\u003C\u002Fp>\n\u003C\u002Fli>\n\u003C\u002Ful>\n","The easiest way to manage the meta robots tag.",100,4693,1,"2015-04-07T19:41:00.000Z","4.1.42","3.0.1","",[19,20,21,22,23],"crawlers","editor","google","meta-robots","robots-txt","http:\u002F\u002Fwww.seo-sign.com\u002Fp\u002Fsimple-ways-to-manage-meta-robots-tag.html","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Fmeta-robots-by-seo-sign.zip",85,0,null,"2026-03-15T15:16:48.613Z",[],{"slug":32,"display_name":7,"profile_url":8,"plugin_count":13,"total_installs":11,"avg_security_score":26,"avg_patch_time_days":33,"trust_score":34,"computed_at":35},"artem-pilipets",30,84,"2026-04-03T21:15:31.956Z",[37,60,81,104,125],{"slug":38,"name":39,"version":40,"author":41,"author_profile":42,"description":43,"short_description":44,"active_installs":45,"downloaded":46,"rating":11,"num_ratings":47,"last_updated":48,"tested_up_to":49,"requires_at_least":50,"requires_php":51,"tags":52,"homepage":58,"download_link":59,"security_score":11,"vuln_count":27,"unpatched_count":27,"last_vuln_date":28,"fetched_at":29},"microthemer","Microthemer Lite – Visual Editor to Customize CSS","7.5.3.7","Themeover","https:\u002F\u002Fprofiles.wordpress.org\u002Fbastywebb\u002F","\u003Cp>A light-weight yet powerful visual editor to customize the CSS styling of any aspect of your site, from Google fonts to responsive layouts. 
Microthemer caters for both coders and non-coders, and plays really well with page builders like Elementor, Beaver Builder, and Oxygen.\u003C\u002Fp>\n\u003Ch4>Feature list\u003C\u002Fh4>\n\u003Col>\n\u003Cli>Style anything\u003C\u002Fli>\n\u003Cli>Use with any theme or plugin\u003C\u002Fli>\n\u003Cli>Point & click visual styling\u003C\u002Fli>\n\u003Cli>Code editor (CSS, Sass, JS)\u003C\u002Fli>\n\u003Cli>Sync code editor with the UI\u003C\u002Fli>\n\u003Cli>Customisable breakpoints\u003C\u002Fli>\n\u003Cli>HTML and CSS inspection\u003C\u002Fli>\n\u003Cli>150+ CSS properties\u003C\u002Fli>\n\u003Cli>Dark or light theme\u003C\u002Fli>\n\u003Cli>Custom toolbar layouts\u003C\u002Fli>\n\u003Cli>Work with any CSS unit\u003C\u002Fli>\n\u003Cli>Color picker with palettes\u003C\u002Fli>\n\u003Cli>Slider, mousewheel, keyboard adjustments\u003C\u002Fli>\n\u003Cli>In-program CSS reference\u003C\u002Fli>\n\u003Cli>History\u003C\u002Fli>\n\u003Cli>Draft mode\u003C\u002Fli>\n\u003Cli>Global or page-specific styling\u003C\u002Fli>\n\u003Cli>Import & export\u003C\u002Fli>\n\u003Cli>Light-weight\u003C\u002Fli>\n\u003Cli>Minify CSS code\u003C\u002Fli>\n\u003Cli>Keyboard shortcuts\u003C\u002Fli>\n\u003Cli>Deep integration with Elementor, Beaver Builder, Oxygen\u003C\u002Fli>\n\u003Cli>Multisite support\u003C\u002Fli>\n\u003Cli>Uninstall MT, but keep your edits\u003C\u002Fli>\n\u003Cli>\u003Cstrong>[Pro]\u003C\u002Fstrong> CSS grid (drag & drop)\u003C\u002Fli>\n\u003Cli>\u003Cstrong>[Pro]\u003C\u002Fstrong> Flexbox\u003C\u002Fli>\n\u003Cli>\u003Cstrong>[Pro]\u003C\u002Fstrong> Stock SVG mask images\u003C\u002Fli>\n\u003Cli>\u003Cstrong>[Pro]\u003C\u002Fstrong> Transform\u003C\u002Fli>\n\u003Cli>\u003Cstrong>[Pro]\u003C\u002Fstrong> Animation\u003C\u002Fli>\n\u003Cli>\u003Cstrong>[Pro]\u003C\u002Fstrong> Transition\u003C\u002Fli>\n\u003C\u002Fol>\n\u003Ch4>Lite VS Pro\u003C\u002Fh4>\n\u003Cp>This lite version limits you to styling 15 things, and doesn’t include the features marked [Pro] 
in the list above. To unlock the full program, you can \u003Ca href=\"https:\u002F\u002Fthemeover.com\u002F\" rel=\"nofollow ugc\">purchase a license\u003C\u002Fa> (monthly, annual, or lifetime).\u003C\u002Fp>\n\u003Ch4>Useful links\u003C\u002Fh4>\n\u003Cul>\n\u003Cli>\u003Ca href=\"https:\u002F\u002Fthemeover.com\u002F\" rel=\"nofollow ugc\">Website\u003C\u002Fa>\u003C\u002Fli>\n\u003Cli>\u003Ca href=\"https:\u002F\u002Fthemeover.com\u002Fintroducing-microthemer-7\u002F\" rel=\"nofollow ugc\">Video docs\u003C\u002Fa>\u003C\u002Fli>\n\u003Cli>\u003Ca href=\"https:\u002F\u002Flivedemo.themeover.com\u002Fsetting-up-demo-site\u002F?create_demo\" rel=\"nofollow ugc\">Live demo\u003C\u002Fa>\u003C\u002Fli>\n\u003Cli>\u003Ca href=\"https:\u002F\u002Fthemeover.com\u002Fforum\u002F\" rel=\"nofollow ugc\">Support forum\u003C\u002Fa>\u003C\u002Fli>\n\u003Cli>\u003Ca href=\"https:\u002F\u002Fwww.facebook.com\u002Fgroups\u002Fmicrothemer\" rel=\"nofollow ugc\">Facebook group\u003C\u002Fa>\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch4>Author note\u003C\u002Fh4>\n\u003Cp>Hello everyone, my name is Sebastian. I’ve designed Microthemer for developers as well as beginners. My aim is to level up beginners by exposing the CSS code Microthemer generates when using the visual controls. This is of course helpful for developers who may wish to make manual edits. Some developers use Microthemer as an in-browser CSS or Sass editor, and just lean on the interface for element selection or more advanced properties like filters, grid, and animation.\u003C\u002Fp>\n\u003Cp>I’ve been happily developing Microthemer and supporting users of varying technical experience in my forum for many years now. I’m always ready to answer questions about the software and help out with CSS hurdles. 
Please don’t hesitate to get in touch!\u003C\u002Fp>\n","A visual editor to customize the CSS styling of anything on your site - from Google fonts to responsive layouts.",10000,2608688,44,"2026-03-04T14:30:00.000Z","6.9.4","6.0","5.6",[53,54,55,56,57],"css","customize","google-fonts","responsive","visual-editor","https:\u002F\u002Fthemeover.com\u002Fmicrothemer","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Fmicrothemer.zip",{"slug":61,"name":62,"version":63,"author":64,"author_profile":65,"description":66,"short_description":67,"active_installs":68,"downloaded":69,"rating":70,"num_ratings":71,"last_updated":72,"tested_up_to":73,"requires_at_least":74,"requires_php":17,"tags":75,"homepage":79,"download_link":80,"security_score":11,"vuln_count":27,"unpatched_count":27,"last_vuln_date":28,"fetched_at":29},"fonts","Fonts","3.0","WP SITES","https:\u002F\u002Fprofiles.wordpress.org\u002Fwordpresssites\u002F","\u003Cp>This plugin adds 2 drop down menus to your visual editor with additional sizes and fonts:\u003C\u002Fp>\n\u003Col>\n\u003Cli>A button for Styles\u003C\u002Fli>\n\u003Cli>A button for Sizes\u003C\u002Fli>\n\u003C\u002Fol>\n\u003Cp>New: You can also add your own selection of \u003Ca href=\"https:\u002F\u002Fwpsites.net\u002Fproduct\u002Fcustom-fonts-for-your-visual-editor-in-wordpress\u002F\" rel=\"nofollow ugc\">Google & Custom fonts\u003C\u002Fa> including premium fonts, to your editor by upgrading to Fonts Pro.\u003C\u002Fp>\n\u003Cp>\u003Cstrong>Fonts Pro:\u003C\u002Fstrong>\u003C\u002Fp>\n\u003Col>\n\u003Cli>\u003Ca href=\"https:\u002F\u002Fwpsites.net\u002Fproduct\u002Fcustom-fonts-for-your-visual-editor-in-wordpress\u002F\" rel=\"nofollow ugc\">Fonts Pro\u003C\u002Fa>\u003C\u002Fli>\n\u003C\u002Fol>\n\u003Ch3>Support\u003C\u002Fh3>\n\u003Cp>New \u003Ca href=\"https:\u002F\u002Fwpsites.net\u002Fcontact\" rel=\"nofollow ugc\">Support\u003C\u002Fa>\u003C\u002Fp>\n\u003Ch3>Gutenberg\u003C\u002Fh3>\n\u003Cp>Works with the latest version of 
Gutenberg\u003C\u002Fp>\n","Add More Font To Your WordPress Editor",9000,310340,88,99,"2025-10-01T12:05:00.000Z","6.8.5","4.0",[76,77,78,61,55],"custom-fonts","editor-fonts","font-plugin","https:\u002F\u002Fwpsites.net\u002Fbest-plugins\u002Fplugin-fonts-styles-sizes-wordpress\u002F","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Ffonts.3.0.zip",{"slug":82,"name":83,"version":84,"author":85,"author_profile":86,"description":87,"short_description":88,"active_installs":89,"downloaded":90,"rating":91,"num_ratings":92,"last_updated":93,"tested_up_to":49,"requires_at_least":94,"requires_php":95,"tags":96,"homepage":17,"download_link":101,"security_score":71,"vuln_count":102,"unpatched_count":27,"last_vuln_date":103,"fetched_at":29},"better-robots-txt","Better Robots.txt – AI-Ready Crawl Control & Bot Governance","3.0.0","Pagup","https:\u002F\u002Fprofiles.wordpress.org\u002Fpagup\u002F","\u003Cp>Better Robots.txt replaces the default WordPress robots.txt workflow with a smarter, structured version you can configure and preview before publishing.\u003C\u002Fp>\n\u003Cp>Instead of a blank textarea, you get a guided wizard with presets, plain-language explanations, and a final Review & Save step so you can inspect the generated robots.txt before it goes live.\u003C\u002Fp>\n\u003Cp>Built for beginners and advanced users alike, Better Robots.txt helps you control how search engines, AI crawlers, SEO tools, archive bots, bad bots, social preview bots, and other automated agents interact with your site.\u003C\u002Fp>\n\u003Cp>Trusted by thousands of WordPress sites, Better Robots.txt is designed for the AI era without resorting to hype, vague promises, or hidden rules.\u003C\u002Fp>\n\u003Cp>Better Robots.txt is available in Free, Pro, and Premium editions. The free plugin covers the guided workflow and essential crawl control features, while Pro and Premium unlock additional governance, protection, and AI-ready modules. 
Some screenshots on the plugin page show features from all three editions.\u003C\u002Fp>\n\u003Ch3>A quick overview\u003C\u002Fh3>\n\u003Cp>\u003Ciframe loading=\"lazy\" title=\"Better robots.txt Video — AI-Ready Crawl Control for WordPress\" src=\"https:\u002F\u002Fplayer.vimeo.com\u002Fvideo\u002F1169756981?dnt=1&app_id=122963\" width=\"750\" height=\"372\" frameborder=\"0\" allow=\"autoplay; fullscreen; picture-in-picture; clipboard-write; encrypted-media; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\">\u003C\u002Fiframe>\u003C\u002Fp>\n\u003Ch3>Why Better Robots.txt is different\u003C\u002Fh3>\n\u003Cp>Most robots.txt plugins fall into one of three categories:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Simple text editor\u003C\u002Fli>\n\u003Cli>Virtual robots.txt manager\u003C\u002Fli>\n\u003Cli>Single-purpose AI or policy add-on\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Better Robots.txt goes further.\u003C\u002Fp>\n\u003Cp>It gives you a complete, guided crawl control workflow so you can:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Choose a preset that matches your goals\u003C\u002Fli>\n\u003Cli>Control major crawler categories without writing everything by hand\u003C\u002Fli>\n\u003Cli>Keep core WordPress protection rules visible and editable\u003C\u002Fli>\n\u003Cli>Clean up low-value crawl paths that waste crawl budget\u003C\u002Fli>\n\u003Cli>Generate a cleaner robots.txt output\u003C\u002Fli>\n\u003Cli>Preview the final result before saving\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch3>What you can control\u003C\u002Fh3>\n\u003Cp>Better Robots.txt helps you manage:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Search engine visibility\u003C\u002Fli>\n\u003Cli>AI and LLM crawler behavior\u003C\u002Fli>\n\u003Cli>AI usage signals such as search, ai-input, and ai-train preferences\u003C\u002Fli>\n\u003Cli>SEO tool crawlers\u003C\u002Fli>\n\u003Cli>Bad bots and abusive crawlers\u003C\u002Fli>\n\u003Cli>Archive and Wayback access\u003C\u002Fli>\n\u003Cli>Feed crawlers and 
crawl traps\u003C\u002Fli>\n\u003Cli>WooCommerce crawl cleanup\u003C\u002Fli>\n\u003Cli>CSS, JavaScript, and image crawling rules\u003C\u002Fli>\n\u003Cli>Social media preview crawlers\u003C\u002Fli>\n\u003Cli>ads.txt and app-ads.txt allowance\u003C\u002Fli>\n\u003Cli>llms.txt generation\u003C\u002Fli>\n\u003Cli>Advanced directives such as crawl-delay and custom rules\u003C\u002Fli>\n\u003Cli>Final review before publishing\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch3>Editions\u003C\u002Fh3>\n\u003Cp>Better Robots.txt is available in three editions:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Free – Includes the guided setup, the Essential preset, core crawl control features, and the final Review & Save workflow.\u003C\u002Fli>\n\u003Cli>Pro – Adds more advanced governance and protection modules, including additional AI, crawler, and cleanup controls.\u003C\u002Fli>\n\u003Cli>Premium – Unlocks the most restrictive and advanced protection options, including the Fortress preset and additional high-control modules.\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Some options shown in the interface are marked Free, Pro, or Premium so users can immediately understand which modules belong to each edition.\u003C\u002Fp>\n\u003Ch3>Presets\u003C\u002Fh3>\n\u003Cp>Setup starts with four modes:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Essential – A clean, practical configuration for most websites that want a better robots.txt without complexity.\u003C\u002Fli>\n\u003Cli>AI-First – For publishers and content sites that want AI-ready governance without shutting down discovery.\u003C\u002Fli>\n\u003Cli>Fortress – For websites that want stronger protection against scraping, archive capture, and unnecessary crawl activity.\u003C\u002Fli>\n\u003Cli>Custom – For users who prefer to configure each module manually.\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>For many sites, one preset plus a quick review is enough.\u003C\u002Fp>\n\u003Ch3>Built for beginners and experts\u003C\u002Fh3>\n\u003Cp>Beginners 
get:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>A guided setup instead of a raw robots.txt box\u003C\u002Fli>\n\u003Cli>Preset-based configuration\u003C\u002Fli>\n\u003Cli>Plain-language explanations for important choices\u003C\u002Fli>\n\u003Cli>A safer workflow with a final preview step\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Advanced users get:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Editable core WordPress protection rules\u003C\u002Fli>\n\u003Cli>Fine-grained crawler controls by category\u003C\u002Fli>\n\u003Cli>WooCommerce-oriented cleanup options\u003C\u002Fli>\n\u003Cli>Consolidated output options\u003C\u002Fli>\n\u003Cli>Advanced directives and custom rules\u003C\u002Fli>\n\u003Cli>A final output they can inspect before publishing\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch3>AI-ready, without hype\u003C\u002Fh3>\n\u003Cp>Better Robots.txt includes features for modern AI-related crawl governance, including:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>AI crawler handling\u003C\u002Fli>\n\u003Cli>Optional llms.txt support\u003C\u002Fli>\n\u003Cli>AI usage signals for compliant systems\u003C\u002Fli>\n\u003Cli>Optional machine-readable governance signals for advanced use cases\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>These features help you express how you want automated systems to use your content.\u003C\u002Fp>\n\u003Cp>However, Better Robots.txt does not claim to control AI by force. 
Like robots.txt itself, these signals are most useful with compliant systems and good-faith crawlers.\u003C\u002Fp>\n\u003Ch3>What Better Robots.txt is\u003C\u002Fh3>\n\u003Cp>Better Robots.txt is:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>A robots.txt governance plugin for WordPress\u003C\u002Fli>\n\u003Cli>A guided configuration workflow instead of a raw text editor\u003C\u002Fli>\n\u003Cli>A crawl control layer to reduce wasteful crawling\u003C\u002Fli>\n\u003Cli>A practical bridge between SEO, crawl hygiene, and AI-era policy signaling\u003C\u002Fli>\n\u003Cli>A way to keep your crawl policy clearer for humans and machines\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Technical reference for advanced users: Better Robots.txt also maintains a public \u003Ca href=\"https:\u002F\u002Fgithub.com\u002FGautierDorval\u002Fbetter-robots-txt\" rel=\"nofollow noopener noreferrer ugc\">GitHub repository\u003C\u002Fa> with product definition, governance notes, and machine-readable artefacts.\u003C\u002Fp>\n\u003Ch3>What Better Robots.txt is not\u003C\u002Fh3>\n\u003Cp>Better Robots.txt is not:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>A firewall or Web Application Firewall (WAF)\u003C\u002Fli>\n\u003Cli>An anti-scraping enforcement engine\u003C\u002Fli>\n\u003Cli>A legal compliance engine\u003C\u002Fli>\n\u003Cli>A guarantee that every bot will obey your rules\u003C\u002Fli>\n\u003Cli>A replacement for server-level security or access control\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>It helps you publish a clearer crawl policy.\u003C\u002Fp>\n\u003Cp>It does not replace infrastructure-level protection.\u003C\u002Fp>\n\u003Ch3>Typical use cases\u003C\u002Fh3>\n\u003Cp>Use Better Robots.txt if you want to:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Clean up a weak or noisy default robots.txt\u003C\u002Fli>\n\u003Cli>Reduce crawl waste on WordPress or WooCommerce\u003C\u002Fli>\n\u003Cli>Keep major search engines allowed while restricting other bots\u003C\u002Fli>\n\u003Cli>Control whether archive bots 
can snapshot your site\u003C\u002Fli>\n\u003Cli>Publish AI usage preferences more clearly\u003C\u002Fli>\n\u003Cli>Keep social preview bots allowed while limiting scrapers\u003C\u002Fli>\n\u003Cli>Review the final file before making it live\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch3>Key Features\u003C\u002Fh3>\n\u003Cul>\n\u003Cli>Guided step-by-step wizard\u003C\u002Fli>\n\u003Cli>Preset-based setup: Essential, AI-First, Fortress, Custom\u003C\u002Fli>\n\u003Cli>Search engine visibility controls\u003C\u002Fli>\n\u003Cli>AI and LLM crawler governance\u003C\u002Fli>\n\u003Cli>AI usage signals support\u003C\u002Fli>\n\u003Cli>SEO tool crawler controls\u003C\u002Fli>\n\u003Cli>Bad bot and abusive crawler options\u003C\u002Fli>\n\u003Cli>Archive and Wayback access controls\u003C\u002Fli>\n\u003Cli>Spam, feed, and crawl trap cleanup\u003C\u002Fli>\n\u003Cli>WooCommerce crawl cleanup options\u003C\u002Fli>\n\u003Cli>CSS, JavaScript, and image crawling rules\u003C\u002Fli>\n\u003Cli>Social media preview crawler controls\u003C\u002Fli>\n\u003Cli>ads.txt and app-ads.txt allowance\u003C\u002Fli>\n\u003Cli>Optional llms.txt generation\u003C\u002Fli>\n\u003Cli>Consolidated output option\u003C\u002Fli>\n\u003Cli>Core WordPress protection rules remain visible and editable\u003C\u002Fli>\n\u003Cli>Final Review & Save preview screen\u003C\u002Fli>\n\u003C\u002Ful>\n","Replace the default WordPress robots.txt workflow with a smarter, structured version you can preview before publishing, with Free, Pro, and Premium ed &hellip;",6000,305034,90,102,"2026-03-10T18:33:00.000Z","5.0","7.4",[97,98,99,23,100],"ai-crawlers","bot-blocker","llms-txt","seo","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Fbetter-robots-txt.3.0.0.zip",2,"2023-02-14 
00:00:00",{"slug":105,"name":106,"version":107,"author":108,"author_profile":109,"description":110,"short_description":111,"active_installs":112,"downloaded":113,"rating":114,"num_ratings":115,"last_updated":116,"tested_up_to":117,"requires_at_least":118,"requires_php":17,"tags":119,"homepage":123,"download_link":124,"security_score":26,"vuln_count":27,"unpatched_count":27,"last_vuln_date":28,"fetched_at":29},"multipart-robotstxt-editor","Multipart robots.txt editor","0.4.0","Viktor Szépe","https:\u002F\u002Fprofiles.wordpress.org\u002Fszepeviktor\u002F","\u003Ch4>This plugin needs more documentation!\u003C\u002Fh4>\n\u003Cp>You can edit your robots.txt and add remote content to it.\u003Cbr \u002F>\nE.g. you have several sites and want to use a centralized robots.txt.\u003C\u002Fp>\n\u003Ch4>Features\u003C\u002Fh4>\n\u003Cul>\n\u003Cli>Include or exclude WordPress’ own robots.txt (core function)\u003C\u002Fli>\n\u003Cli>Include or exclude the output of plugins – e.g. sitemap plugins – in robots.txt (filter output)\u003C\u002Fli>\n\u003Cli>Include or exclude a remote text file (the common part)\u003C\u002Fli>\n\u003Cli>Include or exclude custom records from the settings page (the site-specific part)\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch4>Where is robots.txt?\u003C\u002Fh4>\n\u003Cp>WordPress handles robots.txt as a virtual URL – just the same way as posts and pages.\u003C\u002Fp>\n\u003Cp>So when you browse to \u003Ccode>https:\u002F\u002Fexample.com\u002Frobots.txt\u003C\u002Fcode> WordPress generates robots.txt on the fly.\u003C\u002Fp>\n\u003Ch4>TODO\u003C\u002Fh4>\n\u003Cul>\n\u003Cli>add more description here\u003C\u002Fli>\n\u003Cli>add a video too\u003C\u002Fli>\n\u003Cli>add an admin notice for subdir installs (robots.txt is useless in a subdir)\u003C\u002Fli>\n\u003Cli>‘At least one “Disallow” field must be present in the robots.txt file.’ – check for that\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch4>Links\u003C\u002Fh4>\n\u003Cp>Development of this plugin goes on 
\u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fszepeviktor\u002Fmultipart-robotstxt-editor\" rel=\"nofollow ugc\">GitHub\u003C\u002Fa>.\u003C\u002Fp>\n","Customize your site's robots.txt and include remote content in it",2000,55853,66,3,"2018-02-17T05:52:00.000Z","4.9.29","4.7",[21,120,121,23,122],"robot","robots","search","https:\u002F\u002Fgithub.com\u002Fszepeviktor\u002Fmultipart-robotstxt-editor","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Fmultipart-robotstxt-editor.0.4.0.zip",{"slug":126,"name":127,"version":128,"author":129,"author_profile":130,"description":131,"short_description":132,"active_installs":133,"downloaded":134,"rating":135,"num_ratings":136,"last_updated":137,"tested_up_to":49,"requires_at_least":138,"requires_php":139,"tags":140,"homepage":17,"download_link":143,"security_score":11,"vuln_count":27,"unpatched_count":27,"last_vuln_date":28,"fetched_at":29},"block-ai-crawlers","Block AI Crawlers","1.5.6","lastsplash (a11n)","https:\u002F\u002Fprofiles.wordpress.org\u002Flastsplash\u002F","\u003Cp>Protect Your Content from AI Scraping\u003C\u002Fp>\n\u003Cp>This plugin helps you prevent AI crawlers from using your content as training data for their products. 
By updating your site’s \u003Ccode>robots.txt\u003C\u002Fcode>, it blocks common AI crawlers and scrapers, aiming to protect your content from being used in the training of Large Language Models (LLMs).\u003C\u002Fp>\n\u003Ch3>Features\u003C\u002Fh3>\n\u003Ch3>Blocks AI Crawlers\u003C\u002Fh3>\n\u003Cp>Includes:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>\u003Cstrong>OpenAI\u003C\u002Fstrong> – Blocks crawlers used for ChatGPT\u003C\u002Fli>\n\u003Cli>\u003Cstrong>Google\u003C\u002Fstrong> – Blocks crawlers used by Google’s Gemini AI products\u003C\u002Fli>\n\u003Cli>\u003Cstrong>Facebook \u002F Meta\u003C\u002Fstrong> – Blocks crawlers used for Facebook’s AI training\u003C\u002Fli>\n\u003Cli>\u003Cstrong>Anthropic AI\u003C\u002Fstrong> – Blocks crawlers used by Anthropic\u003C\u002Fli>\n\u003Cli>\u003Cstrong>Perplexity\u003C\u002Fstrong> – Blocks crawlers used by Perplexity\u003C\u002Fli>\n\u003Cli>\u003Cstrong>Applebot\u003C\u002Fstrong> – Blocks crawlers used by Apple\u003C\u002Fli>\n\u003Cli>… and more!\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch3>Experimental Meta Tags\u003C\u002Fh3>\n\u003Cp>The plugin adds the “noai, noimageai” directive to your site’s meta tags, instructing AI bots not to use your content in their datasets. Please note that these tags are experimental and have not been standardized.\u003C\u002Fp>\n\u003Ch3>Custom robots.txt Rules\u003C\u002Fh3>\n\u003Cp>Have custom entries for your robots.txt file? You can now add them directly through the plugin!\u003C\u002Fp>\n\u003Ch3>Usage\u003C\u002Fh3>\n\u003Cp>After activation, the plugin will automatically update your \u003Ccode>robots.txt\u003C\u002Fcode> and add the necessary meta tags. 
No further configuration is required, but you can check the settings page for a full list of blocked crawlers.\u003C\u002Fp>\n\u003Ch3>Limitations\u003C\u002Fh3>\n\u003Cp>While this plugin aims to block specified crawlers, it cannot guarantee complete protection against all forms of scraping, as some bots may disregard \u003Ccode>robots.txt\u003C\u002Fcode> directives.\u003C\u002Fp>\n\u003Ch3>Support\u003C\u002Fh3>\n\u003Cp>For questions or support, \u003Ca href=\"https:\u002F\u002Fwordpress.org\u002Fsupport\u002Fplugin\u002Fblock-ai-crawlers\u002F\" rel=\"ugc\">please post on the forums\u003C\u002Fa> or \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fbobmatyas\u002Fwp-block-ai-crawlers\u002Fissues\" rel=\"nofollow ugc\">on GitHub\u003C\u002Fa>.\u003C\u002Fp>\n","Tell AI (Artificial Intelligence) companies not to scrape your site for their AI products.",1000,13412,96,5,"2026-02-15T13:47:00.000Z","6.8","8.2",[141,142,19,23],"ai","chatgpt","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Fblock-ai-crawlers.1.5.6.zip",{"attackSurface":145,"codeSignals":165,"taintFlows":179,"riskAssessment":236,"analyzedAt":249},{"hooks":146,"ajaxHandlers":161,"restRoutes":162,"shortcodes":163,"cronEvents":164,"entryPointCount":27,"unprotectedCount":27},[147,153,157],{"type":148,"name":149,"callback":150,"file":151,"line":152},"action","wp_head","metarobots_mrs","metarobots-wp.php",127,{"type":148,"name":154,"callback":155,"file":156,"line":102},"admin_menu","mrs_plugin_menu","settings.php",{"type":148,"name":158,"callback":159,"file":156,"line":160},"admin_init","mrs_register_settings",6,[],[],[],[],{"dangerousFunctions":166,"sqlUsage":167,"outputEscaping":169,"fileOperations":177,"externalRequests":27,"nonceChecks":27,"capabilityChecks":27,"bundledLibraries":178},[],{"prepared":27,"raw":27,"locations":168},[],{"escaped":27,"rawEcho":115,"locations":170},[171,174,175],{"file":151,"line":172,"context":173},112,"raw 
output",{"file":156,"line":71,"context":173},{"file":156,"line":176,"context":173},108,10,[],[180,197,206,225],{"entryPoint":181,"graph":182,"unsanitizedCount":13,"severity":196},"metarobots_mrs (metarobots-wp.php:34)",{"nodes":183,"edges":193},[184,188],{"id":185,"type":186,"label":187,"file":151,"line":176},"n0","source","$_SERVER",{"id":189,"type":190,"label":191,"file":151,"line":172,"wp_function":192},"n1","sink","echo() [XSS]","echo",[194],{"from":185,"to":189,"sanitized":195},false,"medium",{"entryPoint":198,"graph":199,"unsanitizedCount":13,"severity":205},"\u003Cmetarobots-wp> (metarobots-wp.php:0)",{"nodes":200,"edges":203},[201,202],{"id":185,"type":186,"label":187,"file":151,"line":176},{"id":189,"type":190,"label":191,"file":151,"line":172,"wp_function":192},[204],{"from":185,"to":189,"sanitized":195},"low",{"entryPoint":207,"graph":208,"unsanitizedCount":102,"severity":205},"mrs_settings_page (settings.php:56)",{"nodes":209,"edges":222},[210,213,216,220],{"id":185,"type":186,"label":211,"file":156,"line":212},"$_POST['rewriteRobots']",63,{"id":189,"type":190,"label":214,"file":156,"line":212,"wp_function":215},"update_option() [Settings Manipulation]","update_option",{"id":217,"type":186,"label":218,"file":156,"line":219},"n2","$_POST['makeIndexFollow']",64,{"id":221,"type":190,"label":214,"file":156,"line":219,"wp_function":215},"n3",[223,224],{"from":185,"to":189,"sanitized":195},{"from":217,"to":221,"sanitized":195},{"entryPoint":226,"graph":227,"unsanitizedCount":102,"severity":205},"\u003Csettings> (settings.php:0)",{"nodes":228,"edges":233},[229,230,231,232],{"id":185,"type":186,"label":211,"file":156,"line":212},{"id":189,"type":190,"label":214,"file":156,"line":212,"wp_function":215},{"id":217,"type":186,"label":218,"file":156,"line":219},{"id":221,"type":190,"label":214,"file":156,"line":219,"wp_function":215},[234,235],{"from":185,"to":189,"sanitized":195},{"from":217,"to":221,"sanitized":195},{"summary":237,"deductions":238},"The 
\"meta-robots-by-seo-sign\" plugin version 1.0.0 exhibits a mixed security posture. On the positive side, the plugin has a seemingly small attack surface with no detected AJAX handlers, REST API routes, shortcodes, or cron events. Furthermore, it demonstrates good practices by using prepared statements for all its SQL queries and having no known vulnerabilities or CVEs in its history. This suggests a developer who is mindful of common attack vectors like SQL injection and has maintained a clean security record so far.\n\nHowever, significant concerns arise from the static analysis. The lack of any output escaping for its detected outputs is a critical weakness. This means that any data displayed to users could potentially be manipulated, leading to cross-site scripting (XSS) vulnerabilities. Additionally, the taint analysis revealed four flows with unsanitized paths, which, although not classified as critical or high severity in this instance, still indicate potential issues with how data is handled and could be exploited in conjunction with other weaknesses. The absence of nonce and capability checks also presents a risk, as it suggests that entry points, if they were to exist, might not be adequately protected against unauthorized access or manipulation.\n\nIn conclusion, while the plugin boasts a clean vulnerability history and good database practices, the complete lack of output escaping and the presence of unsanitized data flows are major security red flags. These issues, coupled with the absence of capability and nonce checks, significantly elevate the risk profile. 
Developers should prioritize addressing the output escaping and taint flow issues to improve the plugin's overall security.",[239,242,245,247],{"reason":240,"points":241},"Unescaped output detected",9,{"reason":243,"points":244},"Unsanitized paths in taint flows (4 flows)",8,{"reason":246,"points":136},"Missing nonce checks",{"reason":248,"points":136},"Missing capability checks","2026-03-16T21:11:28.840Z",{"wat":251,"direct":257},{"assetPaths":252,"generatorPatterns":254,"scriptPaths":255,"versionParams":256},[253],"\u002Fwp-content\u002Fplugins\u002Fmeta-robots-by-seo-sign\u002Fmetarobots.php",[],[],[],{"cssClasses":258,"htmlComments":259,"htmlAttributes":260,"restEndpoints":261,"jsGlobals":262,"shortcodeOutput":263},[],[],[],[],[],[]]