[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"$fjXrkBo4wvsNFqelgYZPRyUD3LptUnjw5SNL87jO8Uvc":3},{"slug":4,"name":5,"version":6,"author":7,"author_profile":8,"description":9,"short_description":10,"active_installs":11,"downloaded":12,"rating":13,"num_ratings":13,"last_updated":14,"tested_up_to":15,"requires_at_least":16,"requires_php":17,"tags":18,"homepage":24,"download_link":25,"security_score":26,"vuln_count":13,"unpatched_count":13,"last_vuln_date":27,"fetched_at":28,"vulnerabilities":29,"developer":30,"crawl_stats":27,"alternatives":35,"analysis":128,"fingerprints":369},"azayem-bots-tracker","Azayem Bots Tracker – Bot Visits Logger","1.0.0","botstracker","https:\u002F\u002Fprofiles.wordpress.org\u002Fbotstracker\u002F","\u003Cp>Bots Tracker is a custom-built plugin designed to:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Detect and log visits from search engine bots and crawlers.\u003C\u002Fli>\n\u003Cli>Store visits in a dedicated database table for analysis.\u003C\u002Fli>\n\u003Cli>Display detailed bot visits inside WordPress Admin (with filters, pagination, and per-URL crawl budget).\u003C\u002Fli>\n\u003Cli>Compare crawl budget between two time ranges (current vs previous period).\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Main features:\u003C\u002Fp>\n\u003Col>\n\u003Cli>\n\u003Cp>\u003Cstrong>Bot Detection & Logging\u003C\u002Fstrong>\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Detects various bots (Googlebot, Bingbot, GPTBot, etc.) 
using User-Agent signatures.\u003C\u002Fli>\n\u003Cli>Saves bot name, visited URL path, IP address, and visit time to a custom database table.\u003C\u002Fli>\n\u003C\u002Ful>\n\u003C\u002Fli>\n\u003Cli>\n\u003Cp>\u003Cstrong>Admin Dashboard Pages\u003C\u002Fstrong>\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Main Report: paginate and filter bot visits, with search by URL, bot name, and date range.\u003C\u002Fli>\n\u003Cli>Crawl Budget by URL: aggregate visits per URL with total hits, unique bots, and first\u002Flast visit times.\u003C\u002Fli>\n\u003Cli>Comparative Report: compare total crawl budget between current and previous periods.\u003C\u002Fli>\n\u003C\u002Ful>\n\u003C\u002Fli>\n\u003Cli>\n\u003Cp>\u003Cstrong>Clean Uninstall\u003C\u002Fstrong>\u003C\u002Fp>\n\u003Cul>\n\u003Cli>On plugin deletion, the custom database table and plugin options are removed from WordPress to keep your database clean.\u003C\u002Fli>\n\u003C\u002Ful>\n\u003C\u002Fli>\n\u003Cli>\n\u003Cp>\u003Cstrong>Settings & Data Management\u003C\u002Fstrong>\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Configure how long bot visit logs are stored (data retention in days).\u003C\u002Fli>\n\u003Cli>Enable or disable bot filtering to store only selected bots (e.g., only Googlebot and Bingbot) or to exclude specific bots.\u003C\u002Fli>\n\u003Cli>Run on-demand database optimization for the bot visits table directly from the settings page.\u003C\u002Fli>\n\u003Cli>Manually delete old records using a one-click cleanup tool.\u003C\u002Fli>\n\u003Cli>Automatic cleanup via WP-Cron based on your configured retention period.\u003C\u002Fli>\n\u003C\u002Ful>\n\u003C\u002Fli>\n\u003C\u002Fol>\n\u003Ch3>Privacy\u003C\u002Fh3>\n\u003Cp>This plugin stores the following data about bot visits in a custom database table:\u003Cbr \u002F>\n– Bot name (derived from the User-Agent string)\u003Cbr \u002F>\n– Bot IP address\u003Cbr \u002F>\n– Visited URL path\u003Cbr \u002F>\n– Visit timestamp\u003C\u002Fp>\n\u003Cp>No data is sent to 
any third-party servers. This plugin tracks bots only, not human visitors.\u003C\u002Fp>\n","A custom plugin to log search engine bot visits and analyze their crawl budget directly inside WordPress (no external services required).",10,147,0,"2025-12-25T14:10:00.000Z","6.9.4","5.0","7.4",[19,20,21,22,23],"analytics","bots","crawl-budget","crawler","seo","","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Fazayem-bots-tracker.1.0.0.zip",100,null,"2026-03-15T15:16:48.613Z",[],{"slug":7,"display_name":7,"profile_url":8,"plugin_count":31,"total_installs":11,"avg_security_score":26,"avg_patch_time_days":32,"trust_score":33,"computed_at":34},1,30,94,"2026-04-05T02:05:41.452Z",[36,53,73,93,110],{"slug":37,"name":38,"version":39,"author":40,"author_profile":41,"description":42,"short_description":43,"active_installs":11,"downloaded":44,"rating":13,"num_ratings":13,"last_updated":45,"tested_up_to":46,"requires_at_least":16,"requires_php":47,"tags":48,"homepage":50,"download_link":51,"security_score":52,"vuln_count":13,"unpatched_count":13,"last_vuln_date":27,"fetched_at":28},"seobot-monitor","SEObot Monitor for Googlebot, Bingbot and search engine spiders","2.0.0","Santiago Alonso","https:\u002F\u002Fprofiles.wordpress.org\u002Fsalonsoweb\u002F","\u003Cp>With SEObot Monitor for Googlebot you will be able to dump the server logs to Google Analytics, easily and automatically.\u003C\u002Fp>\n\u003Cp>This will allow you to detect SEO failures before it is too late.\u003C\u002Fp>\n\u003Ch3>Googlebot Monitor features\u003C\u002Fh3>\n\u003Cp>This plugin allows you to monitor bots such as those used by Google or Bing and dump the data into Google Analytics. In this way, access to key information from the server logs is easier. 
The track is automatic, and does not require any action by the team managing the server or web development.\u003C\u002Fp>\n\u003Cp>Log analysis can help you understand if search engines are visiting all pages or if some pages are having crawling problems.\u003C\u002Fp>\n\u003Ch3>How it works\u003C\u002Fh3>\n\u003Cp>To activate it and make it work properly, you must:\u003C\u002Fp>\n\u003Col>\n\u003Cli>Create a new property in your GA account (IMPORTANT: NEVER use the same Google Analytics UA-XXXXX that you usually use for tracking)\u003C\u002Fli>\n\u003Cli>Enter the UA of the new property in the plugin options.\u003C\u002Fli>\n\u003Cli>You can extend the regular expression to register additional bots (Advanced users only)\u003C\u002Fli>\n\u003C\u002Fol>\n\u003Ch3>Plugin compatibility\u003C\u002Fh3>\n\u003Cp>Googlebot Monitor is fully compatible with WordPress and also with YOAST SEO plugin\u003C\u002Fp>\n\u003Cp>This plugin use native WordPress hooks for total compatibility.\u003C\u002Fp>\n","With SEObot Monitor for Googlebot you will be able to dump the server logs to Google Analytics, easily and automatically. 
This will allow you to dete &hellip;",1210,"2021-08-02T12:47:00.000Z","5.8.13","5.2.4",[19,20,22,49,23],"google","https:\u002F\u002Fbuenamanera.com","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Fseobot-monitor.2.0.0.zip",85,{"slug":54,"name":55,"version":56,"author":57,"author_profile":58,"description":59,"short_description":60,"active_installs":61,"downloaded":62,"rating":63,"num_ratings":64,"last_updated":65,"tested_up_to":66,"requires_at_least":67,"requires_php":68,"tags":69,"homepage":24,"download_link":72,"security_score":52,"vuln_count":13,"unpatched_count":13,"last_vuln_date":27,"fetched_at":28},"robots-txt-editor","Robots.txt Editor","1.1.4","Processby","https:\u002F\u002Fprofiles.wordpress.org\u002Fprocessby\u002F","\u003Cp>The plugin allows you to create and edit the robots.txt file on your site.\u003C\u002Fp>\n\u003Ch4>Features\u003C\u002Fh4>\n\u003Cul>\n\u003Cli>Works with multisite network on Subdomains;\u003C\u002Fli>\n\u003Cli>An example of the correct file for WordPress;\u003C\u002Fli>\n\u003Cli>Works out of the box;\u003C\u002Fli>\n\u003Cli>Totally Free.\u003C\u002Fli>\n\u003C\u002Ful>\n","Robots.txt for WordPress",10000,111434,90,8,"2021-01-16T00:07:00.000Z","5.6.17","4.0","5.6",[22,70,71,23],"robots","robots-txt","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Frobots-txt-editor.zip",{"slug":74,"name":75,"version":76,"author":77,"author_profile":78,"description":79,"short_description":80,"active_installs":81,"downloaded":82,"rating":63,"num_ratings":83,"last_updated":84,"tested_up_to":15,"requires_at_least":16,"requires_php":17,"tags":85,"homepage":24,"download_link":89,"security_score":90,"vuln_count":91,"unpatched_count":13,"last_vuln_date":92,"fetched_at":28},"better-robots-txt","Better Robots.txt – AI-Ready Crawl Control & Bot Governance","3.0.0","Pagup","https:\u002F\u002Fprofiles.wordpress.org\u002Fpagup\u002F","\u003Cp>Better Robots.txt replaces the default WordPress robots.txt workflow with a smarter, structured 
version you can configure and preview before publishing.\u003C\u002Fp>\n\u003Cp>Instead of a blank textarea, you get a guided wizard with presets, plain-language explanations, and a final Review & Save step so you can inspect the generated robots.txt before it goes live.\u003C\u002Fp>\n\u003Cp>Built for beginners and advanced users alike, Better Robots.txt helps you control how search engines, AI crawlers, SEO tools, archive bots, bad bots, social preview bots, and other automated agents interact with your site.\u003C\u002Fp>\n\u003Cp>Trusted by thousands of WordPress sites, Better Robots.txt is designed for the AI era without resorting to hype, vague promises, or hidden rules.\u003C\u002Fp>\n\u003Cp>Better Robots.txt is available in Free, Pro, and Premium editions. The free plugin covers the guided workflow and essential crawl control features, while Pro and Premium unlock additional governance, protection, and AI-ready modules. Some screenshots on the plugin page show features from all three editions.\u003C\u002Fp>\n\u003Ch3>A quick overview\u003C\u002Fh3>\n\u003Cp>\u003Ciframe loading=\"lazy\" title=\"Better robots.txt Video — AI-Ready Crawl Control for WordPress\" src=\"https:\u002F\u002Fplayer.vimeo.com\u002Fvideo\u002F1169756981?dnt=1&app_id=122963\" width=\"750\" height=\"372\" frameborder=\"0\" allow=\"autoplay; fullscreen; picture-in-picture; clipboard-write; encrypted-media; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\">\u003C\u002Fiframe>\u003C\u002Fp>\n\u003Ch3>Why Better Robots.txt is different\u003C\u002Fh3>\n\u003Cp>Most robots.txt plugins fall into one of three categories:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Simple text editor\u003C\u002Fli>\n\u003Cli>Virtual robots.txt manager\u003C\u002Fli>\n\u003Cli>Single-purpose AI or policy add-on\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Better Robots.txt goes further.\u003C\u002Fp>\n\u003Cp>It gives you a complete, guided crawl control workflow so you 
can:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Choose a preset that matches your goals\u003C\u002Fli>\n\u003Cli>Control major crawler categories without writing everything by hand\u003C\u002Fli>\n\u003Cli>Keep core WordPress protection rules visible and editable\u003C\u002Fli>\n\u003Cli>Clean up low-value crawl paths that waste crawl budget\u003C\u002Fli>\n\u003Cli>Generate a cleaner robots.txt output\u003C\u002Fli>\n\u003Cli>Preview the final result before saving\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch3>What you can control\u003C\u002Fh3>\n\u003Cp>Better Robots.txt helps you manage:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Search engine visibility\u003C\u002Fli>\n\u003Cli>AI and LLM crawler behavior\u003C\u002Fli>\n\u003Cli>AI usage signals such as search, ai-input, and ai-train preferences\u003C\u002Fli>\n\u003Cli>SEO tool crawlers\u003C\u002Fli>\n\u003Cli>Bad bots and abusive crawlers\u003C\u002Fli>\n\u003Cli>Archive and Wayback access\u003C\u002Fli>\n\u003Cli>Feed crawlers and crawl traps\u003C\u002Fli>\n\u003Cli>WooCommerce crawl cleanup\u003C\u002Fli>\n\u003Cli>CSS, JavaScript, and image crawling rules\u003C\u002Fli>\n\u003Cli>Social media preview crawlers\u003C\u002Fli>\n\u003Cli>ads.txt and app-ads.txt allowance\u003C\u002Fli>\n\u003Cli>llms.txt generation\u003C\u002Fli>\n\u003Cli>Advanced directives such as crawl-delay and custom rules\u003C\u002Fli>\n\u003Cli>Final review before publishing\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch3>Editions\u003C\u002Fh3>\n\u003Cp>Better Robots.txt is available in three editions:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Free – Includes the guided setup, the Essential preset, core crawl control features, and the final Review & Save workflow.\u003C\u002Fli>\n\u003Cli>Pro – Adds more advanced governance and protection modules, including additional AI, crawler, and cleanup controls.\u003C\u002Fli>\n\u003Cli>Premium – Unlocks the most restrictive and advanced protection options, including the Fortress preset and additional high-control 
modules.\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Some options shown in the interface are marked Free, Pro, or Premium so users can immediately understand which modules belong to each edition.\u003C\u002Fp>\n\u003Ch3>Presets\u003C\u002Fh3>\n\u003Cp>Setup starts with four modes:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Essential – A clean, practical configuration for most websites that want a better robots.txt without complexity.\u003C\u002Fli>\n\u003Cli>AI-First – For publishers and content sites that want AI-ready governance without shutting down discovery.\u003C\u002Fli>\n\u003Cli>Fortress – For websites that want stronger protection against scraping, archive capture, and unnecessary crawl activity.\u003C\u002Fli>\n\u003Cli>Custom – For users who prefer to configure each module manually.\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>For many sites, one preset plus a quick review is enough.\u003C\u002Fp>\n\u003Ch3>Built for beginners and experts\u003C\u002Fh3>\n\u003Cp>Beginners get:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>A guided setup instead of a raw robots.txt box\u003C\u002Fli>\n\u003Cli>Preset-based configuration\u003C\u002Fli>\n\u003Cli>Plain-language explanations for important choices\u003C\u002Fli>\n\u003Cli>A safer workflow with a final preview step\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Advanced users get:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Editable core WordPress protection rules\u003C\u002Fli>\n\u003Cli>Fine-grained crawler controls by category\u003C\u002Fli>\n\u003Cli>WooCommerce-oriented cleanup options\u003C\u002Fli>\n\u003Cli>Consolidated output options\u003C\u002Fli>\n\u003Cli>Advanced directives and custom rules\u003C\u002Fli>\n\u003Cli>A final output they can inspect before publishing\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch3>AI-ready, without hype\u003C\u002Fh3>\n\u003Cp>Better Robots.txt includes features for modern AI-related crawl governance, including:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>AI crawler handling\u003C\u002Fli>\n\u003Cli>Optional llms.txt 
support\u003C\u002Fli>\n\u003Cli>AI usage signals for compliant systems\u003C\u002Fli>\n\u003Cli>Optional machine-readable governance signals for advanced use cases\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>These features help you express how you want automated systems to use your content.\u003C\u002Fp>\n\u003Cp>However, Better Robots.txt does not claim to control AI by force. Like robots.txt itself, these signals are most useful with compliant systems and good-faith crawlers.\u003C\u002Fp>\n\u003Ch3>What Better Robots.txt is\u003C\u002Fh3>\n\u003Cp>Better Robots.txt is:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>A robots.txt governance plugin for WordPress\u003C\u002Fli>\n\u003Cli>A guided configuration workflow instead of a raw text editor\u003C\u002Fli>\n\u003Cli>A crawl control layer to reduce wasteful crawling\u003C\u002Fli>\n\u003Cli>A practical bridge between SEO, crawl hygiene, and AI-era policy signaling\u003C\u002Fli>\n\u003Cli>A way to keep your crawl policy clearer for humans and machines\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>Technical reference for advanced users: Better Robots.txt also maintains a public \u003Ca href=\"https:\u002F\u002Fgithub.com\u002FGautierDorval\u002Fbetter-robots-txt\" rel=\"nofollow noopener noreferrer ugc\">GitHub repository\u003C\u002Fa> with product definition, governance notes, and machine-readable artefacts.\u003C\u002Fp>\n\u003Ch3>What Better Robots.txt is not\u003C\u002Fh3>\n\u003Cp>Better Robots.txt is not:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>A firewall or Web Application Firewall (WAF)\u003C\u002Fli>\n\u003Cli>An anti-scraping enforcement engine\u003C\u002Fli>\n\u003Cli>A legal compliance engine\u003C\u002Fli>\n\u003Cli>A guarantee that every bot will obey your rules\u003C\u002Fli>\n\u003Cli>A replacement for server-level security or access control\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>It helps you publish a clearer crawl policy.\u003C\u002Fp>\n\u003Cp>It does not replace infrastructure-level 
protection.\u003C\u002Fp>\n\u003Ch3>Typical use cases\u003C\u002Fh3>\n\u003Cp>Use Better Robots.txt if you want to:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Clean up a weak or noisy default robots.txt\u003C\u002Fli>\n\u003Cli>Reduce crawl waste on WordPress or WooCommerce\u003C\u002Fli>\n\u003Cli>Keep major search engines allowed while restricting other bots\u003C\u002Fli>\n\u003Cli>Control whether archive bots can snapshot your site\u003C\u002Fli>\n\u003Cli>Publish AI usage preferences more clearly\u003C\u002Fli>\n\u003Cli>Keep social preview bots allowed while limiting scrapers\u003C\u002Fli>\n\u003Cli>Review the final file before making it live\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Ch3>Key Features\u003C\u002Fh3>\n\u003Cul>\n\u003Cli>Guided step-by-step wizard\u003C\u002Fli>\n\u003Cli>Preset-based setup: Essential, AI-First, Fortress, Custom\u003C\u002Fli>\n\u003Cli>Search engine visibility controls\u003C\u002Fli>\n\u003Cli>AI and LLM crawler governance\u003C\u002Fli>\n\u003Cli>AI usage signals support\u003C\u002Fli>\n\u003Cli>SEO tool crawler controls\u003C\u002Fli>\n\u003Cli>Bad bot and abusive crawler options\u003C\u002Fli>\n\u003Cli>Archive and Wayback access controls\u003C\u002Fli>\n\u003Cli>Spam, feed, and crawl trap cleanup\u003C\u002Fli>\n\u003Cli>WooCommerce crawl cleanup options\u003C\u002Fli>\n\u003Cli>CSS, JavaScript, and image crawling rules\u003C\u002Fli>\n\u003Cli>Social media preview crawler controls\u003C\u002Fli>\n\u003Cli>ads.txt and app-ads.txt allowance\u003C\u002Fli>\n\u003Cli>Optional llms.txt generation\u003C\u002Fli>\n\u003Cli>Consolidated output option\u003C\u002Fli>\n\u003Cli>Core WordPress protection rules remain visible and editable\u003C\u002Fli>\n\u003Cli>Final Review & Save preview screen\u003C\u002Fli>\n\u003C\u002Ful>\n","Replace the default WordPress robots.txt workflow with a smarter, structured version you can preview before publishing, with Free, Pro, and Premium ed 
&hellip;",6000,305034,102,"2026-03-10T18:33:00.000Z",[86,87,88,71,23],"ai-crawlers","bot-blocker","llms-txt","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Fbetter-robots-txt.3.0.0.zip",99,2,"2023-02-14 00:00:00",{"slug":94,"name":95,"version":96,"author":97,"author_profile":98,"description":99,"short_description":100,"active_installs":101,"downloaded":102,"rating":26,"num_ratings":103,"last_updated":104,"tested_up_to":15,"requires_at_least":105,"requires_php":106,"tags":107,"homepage":108,"download_link":109,"security_score":26,"vuln_count":13,"unpatched_count":13,"last_vuln_date":27,"fetched_at":28},"db-robotstxt","Bisteinoff SEO Robots.txt","4.0.3","Denis Bisteinov","https:\u002F\u002Fprofiles.wordpress.org\u002Fbisteinoff\u002F","\u003Cp>Have you encountered an obstacle while creating and editing robots.txt file on your website?\u003C\u002Fp>\n\u003Cp>Bisteinoff SEO Robots.txt is an easy-to-use plugin that helps you generate and configure a correct robots.txt file, which is essential for search engine optimization (SEO). This file defines crawling rules for search engine bots such as Google, Bing, Yahoo!, Yandex, and others.\u003C\u002Fp>\n\u003Cp>The plugin works perfectly both if the file robots.txt has never been created or if it already exists. Once installed the plugin makes an optimized robots.txt file that includes special rules common for WordPress websites. 
After that, you can proceed with further customization specific to your own website if needed.\u003C\u002Fp>\n\u003Cp>If the plugin detects one or several Sitemap XML files, it will include them in the robots.txt file.\u003C\u002Fp>\n\u003Cp>No FTP access, manual coding, or file editing is required, which makes managing settings easy and convenient!\u003C\u002Fp>\n\u003Ch4>Key Features\u003C\u002Fh4>\n\u003Cul>\n\u003Cli>Automatic generation of optimized robots.txt with WordPress-specific rules\u003C\u002Fli>\n\u003Cli>Special rules for Google and Yandex search engines\u003C\u002Fli>\n\u003Cli>Custom rules support for any search engine bot\u003C\u002Fli>\n\u003Cli>Automatic sitemap detection and inclusion\u003C\u002Fli>\n\u003Cli>WooCommerce compatibility with specific rules\u003C\u002Fli>\n\u003Cli>Multisite support\u003C\u002Fli>\n\u003Cli>Easy-to-use admin interface\u003C\u002Fli>\n\u003Cli>Modern PHP architecture with namespaces for conflict-free operation\u003C\u002Fli>\n\u003C\u002Ful>\n","An easy-to-use plugin that generates and configures a proper robots.txt file, essential for effective search engine optimization (SEO).",500,10243,4,"2025-12-19T00:04:00.000Z","4.6","7.0",[22,49,70,71,23],"https:\u002F\u002Fgithub.com\u002Fbisteinoff\u002Fdb-robotstxt","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Fdb-robotstxt.4.0.3.zip",{"slug":111,"name":112,"version":113,"author":114,"author_profile":115,"description":116,"short_description":117,"active_installs":118,"downloaded":119,"rating":26,"num_ratings":120,"last_updated":121,"tested_up_to":15,"requires_at_least":16,"requires_php":17,"tags":122,"homepage":126,"download_link":127,"security_score":26,"vuln_count":13,"unpatched_count":13,"last_vuln_date":27,"fetched_at":28},"ai-content-signals","AI Content Signals","1.0.1","Fernando Tellado","https:\u002F\u002Fprofiles.wordpress.org\u002Ffernandot\u002F","\u003Cp>AI Content Signals allows you to easily implement the Content Signals Policy in your WordPress site’s 
robots.txt file. This gives you more control over how AI crawlers and large language models (LLMs) can use your content.\u003C\u002Fp>\n\u003Cp>\u003Cstrong>What are Content Signals?\u003C\u002Fstrong>\u003C\u002Fp>\n\u003Cp>Content Signals is an extension to the robots.txt standard created by Cloudflare that lets you specify three types of permissions for AI crawlers:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>\u003Cstrong>search\u003C\u002Fstrong> – Allow or deny search indexing and traditional search results\u003C\u002Fli>\n\u003Cli>\u003Cstrong>ai-input\u003C\u002Fstrong> – Allow or deny using your content for real-time AI responses (RAG, AI Overviews)\u003C\u002Fli>\n\u003Cli>\u003Cstrong>ai-train\u003C\u002Fstrong> – Allow or deny using your content for training AI models\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>\u003Cstrong>Key Features\u003C\u002Fstrong>\u003C\u002Fp>\n\u003Cul>\n\u003Cli>Easy-to-use settings page in WordPress admin\u003C\u002Fli>\n\u003Cli>Set global defaults for all crawlers\u003C\u002Fli>\n\u003Cli>Configure specific settings for individual AI bots (GPTBot, ClaudeBot, PerplexityBot, etc.)\u003C\u002Fli>\n\u003Cli>Add custom bot User-Agents\u003C\u002Fli>\n\u003Cli>Supports both physical and virtual robots.txt files\u003C\u002Fli>\n\u003Cli>\u003Cstrong>Option to create physical robots.txt with basic WordPress rules\u003C\u002Fstrong>\u003C\u002Fli>\n\u003Cli>Preview generated Content Signals before applying\u003C\u002Fli>\n\u003Cli>Optional legal text with EU Directive reference\u003C\u002Fli>\n\u003Cli>Works with existing robots.txt from SEO plugins\u003C\u002Fli>\n\u003Cli>Automatic sitemap detection and inclusion\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>\u003Cstrong>Supported Bots\u003C\u002Fstrong>\u003C\u002Fp>\n\u003Cp>The plugin includes predefined settings for major AI crawlers:\u003C\u002Fp>\n\u003Cul>\n\u003Cli>OpenAI GPTBot and ChatGPT-User\u003C\u002Fli>\n\u003Cli>Anthropic ClaudeBot and Claude-Web\u003C\u002Fli>\n\u003Cli>Perplexity 
Bot\u003C\u002Fli>\n\u003Cli>Google Extended (Bard\u002FGemini)\u003C\u002Fli>\n\u003Cli>Common Crawl Bot\u003C\u002Fli>\n\u003Cli>Meta\u002FFacebook Bot\u003C\u002Fli>\n\u003Cli>And many more…\u003C\u002Fli>\n\u003C\u002Ful>\n\u003Cp>\u003Cstrong>Important Notice\u003C\u002Fstrong>\u003C\u002Fp>\n\u003Cp>Content Signals is a declarative standard – it expresses your preferences but does not technically enforce them. AI companies are not legally required to respect these signals, though the plugin includes legal text referencing EU copyright directives.\u003C\u002Fp>\n\u003Cp>This plugin works best when combined with other protection measures like traditional robots.txt rules and server-level bot management.\u003C\u002Fp>\n","Add Content Signals to your robots.txt to control how AI crawlers can use your content.",200,605,3,"2025-12-28T17:52:00.000Z",[123,124,125,71,23],"ai","cloudflare","crawlers","https:\u002F\u002Fservicios.ayudawp.com","https:\u002F\u002Fdownloads.wordpress.org\u002Fplugin\u002Fai-content-signals.1.0.1.zip",{"attackSurface":129,"codeSignals":172,"taintFlows":232,"riskAssessment":359,"analyzedAt":368},{"hooks":130,"ajaxHandlers":166,"restRoutes":167,"shortcodes":168,"cronEvents":169,"entryPointCount":13,"unprotectedCount":13},[131,137,141,146,151,156,162],{"type":132,"name":133,"callback":134,"file":135,"line":136},"action","plugins_loaded","init","bots-tracker.php",54,{"type":132,"name":138,"callback":139,"file":135,"line":140},"admin_init","bt_bots_tracker_add_privacy_policy_content",83,{"type":132,"name":142,"callback":143,"file":144,"line":145},"admin_menu","register_menu","includes\\admin\\class-bt-admin-menu.php",45,{"type":132,"name":147,"callback":148,"file":149,"line":150},"admin_post_bt_bots_export_crawl_csv","export_csv","includes\\admin\\pages\\class-bt-page-crawl.php",1250,{"type":132,"name":152,"callback":153,"file":154,"line":155},"bt_bots_tracker_cleanup_event","run_cleanup","includes\\class-bt-loader.php",158,{"type":157,"name":158
,"callback":159,"priority":11,"file":160,"line":161},"filter","status_header","capture_status_code","includes\\class-bt-tracker.php",19,{"type":132,"name":163,"callback":164,"file":160,"line":165},"shutdown","maybe_track_bot_visit",23,[],[],[],[170],{"hook":152,"callback":152,"file":154,"line":171},44,{"dangerousFunctions":173,"sqlUsage":174,"outputEscaping":193,"fileOperations":31,"externalRequests":13,"nonceChecks":176,"capabilityChecks":103,"bundledLibraries":231},[],{"prepared":175,"raw":176,"locations":177},41,5,[178,182,185,188,190],{"file":179,"line":180,"context":181},"includes\\admin\\pages\\class-bt-page-setting.php",260,"$wpdb->get_var() with variable interpolation",{"file":179,"line":183,"context":184},263,"$wpdb->query() with variable interpolation",{"file":186,"line":187,"context":184},"includes\\class-bt-database.php",115,{"file":186,"line":189,"context":184},593,{"file":191,"line":192,"context":184},"uninstall.php",21,{"escaped":194,"rawEcho":195,"locations":196},173,16,[197,200,202,204,206,208,211,213,215,217,219,221,223,225,227,229],{"file":149,"line":198,"context":199},610,"raw output",{"file":149,"line":201,"context":199},904,{"file":149,"line":203,"context":199},906,{"file":149,"line":205,"context":199},939,{"file":149,"line":207,"context":199},941,{"file":209,"line":210,"context":199},"includes\\admin\\pages\\class-bt-page-main.php",660,{"file":209,"line":212,"context":199},662,{"file":209,"line":214,"context":199},695,{"file":209,"line":216,"context":199},697,{"file":179,"line":218,"context":199},86,{"file":179,"line":220,"context":199},234,{"file":179,"line":222,"context":199},244,{"file":179,"line":224,"context":199},270,{"file":179,"line":226,"context":199},279,{"file":179,"line":228,"context":199},442,{"file":179,"line":230,"context":199},452,[],[233,252,265,278,313,330,348],{"entryPoint":234,"graph":235,"unsanitizedCount":13,"severity":251},"render 
(includes\\admin\\pages\\class-bt-page-crawl.php:148)",{"nodes":236,"edges":248},[237,242],{"id":238,"type":239,"label":240,"file":149,"line":241},"n0","source","$_GET (x17)",189,{"id":243,"type":244,"label":245,"file":149,"line":246,"wp_function":247},"n1","sink","echo() [XSS]",544,"echo",[249],{"from":238,"to":243,"sanitized":250},true,"low",{"entryPoint":253,"graph":254,"unsanitizedCount":13,"severity":251},"export_csv (includes\\admin\\pages\\class-bt-page-crawl.php:1005)",{"nodes":255,"edges":263},[256,259],{"id":238,"type":239,"label":257,"file":149,"line":258},"$_GET",1021,{"id":243,"type":244,"label":260,"file":149,"line":261,"wp_function":262},"header() [Header Injection]",1157,"header",[264],{"from":238,"to":243,"sanitized":250},{"entryPoint":266,"graph":267,"unsanitizedCount":13,"severity":251},"\u003Cclass-bt-page-crawl> (includes\\admin\\pages\\class-bt-page-crawl.php:0)",{"nodes":268,"edges":275},[269,270,271,273],{"id":238,"type":239,"label":240,"file":149,"line":241},{"id":243,"type":244,"label":245,"file":149,"line":246,"wp_function":247},{"id":272,"type":239,"label":257,"file":149,"line":258},"n2",{"id":274,"type":244,"label":260,"file":149,"line":261,"wp_function":262},"n3",[276,277],{"from":238,"to":243,"sanitized":250},{"from":272,"to":274,"sanitized":250},{"entryPoint":279,"graph":280,"unsanitizedCount":13,"severity":251},"render (includes\\admin\\pages\\class-bt-page-main.php:149)",{"nodes":281,"edges":308},[282,284,288,290,294,298,301,305],{"id":238,"type":239,"label":257,"file":209,"line":283},269,{"id":243,"type":244,"label":285,"file":209,"line":286,"wp_function":287},"get_var() [SQLi]",302,"get_var",{"id":272,"type":239,"label":289,"file":209,"line":283},"$_GET (x2)",{"id":274,"type":244,"label":291,"file":209,"line":292,"wp_function":293},"get_results() [SQLi]",328,"get_results",{"id":295,"type":239,"label":296,"file":209,"line":297},"n4","$_GET 
(x9)",186,{"id":299,"type":244,"label":245,"file":209,"line":300,"wp_function":247},"n5",486,{"id":302,"type":239,"label":303,"file":209,"line":304},"n6","$_SERVER (x3)",394,{"id":306,"type":244,"label":245,"file":209,"line":307,"wp_function":247},"n7",586,[309,310,311,312],{"from":238,"to":243,"sanitized":250},{"from":272,"to":274,"sanitized":250},{"from":295,"to":299,"sanitized":250},{"from":302,"to":306,"sanitized":250},{"entryPoint":314,"graph":315,"unsanitizedCount":13,"severity":251},"\u003Cclass-bt-page-main> (includes\\admin\\pages\\class-bt-page-main.php:0)",{"nodes":316,"edges":325},[317,318,319,320,321,322,323,324],{"id":238,"type":239,"label":257,"file":209,"line":283},{"id":243,"type":244,"label":285,"file":209,"line":286,"wp_function":287},{"id":272,"type":239,"label":289,"file":209,"line":283},{"id":274,"type":244,"label":291,"file":209,"line":292,"wp_function":293},{"id":295,"type":239,"label":296,"file":209,"line":297},{"id":299,"type":244,"label":245,"file":209,"line":300,"wp_function":247},{"id":302,"type":239,"label":303,"file":209,"line":304},{"id":306,"type":244,"label":245,"file":209,"line":307,"wp_function":247},[326,327,328,329],{"from":238,"to":243,"sanitized":250},{"from":272,"to":274,"sanitized":250},{"from":295,"to":299,"sanitized":250},{"from":302,"to":306,"sanitized":250},{"entryPoint":331,"graph":332,"unsanitizedCount":13,"severity":251},"render (includes\\admin\\pages\\class-bt-page-setting.php:20)",{"nodes":333,"edges":345},[334,337,341,343],{"id":238,"type":239,"label":335,"file":179,"line":336},"$_POST (x4)",63,{"id":243,"type":244,"label":338,"file":179,"line":339,"wp_function":340},"update_option() [Settings Manipulation]",71,"update_option",{"id":272,"type":239,"label":342,"file":179,"line":336},"$_POST 
(x2)",{"id":274,"type":244,"label":245,"file":179,"line":344,"wp_function":247},386,[346,347],{"from":238,"to":243,"sanitized":250},{"from":272,"to":274,"sanitized":250},{"entryPoint":349,"graph":350,"unsanitizedCount":13,"severity":251},"\u003Cclass-bt-page-setting> (includes\\admin\\pages\\class-bt-page-setting.php:0)",{"nodes":351,"edges":356},[352,353,354,355],{"id":238,"type":239,"label":335,"file":179,"line":336},{"id":243,"type":244,"label":338,"file":179,"line":339,"wp_function":340},{"id":272,"type":239,"label":342,"file":179,"line":336},{"id":274,"type":244,"label":245,"file":179,"line":344,"wp_function":247},[357,358],{"from":238,"to":243,"sanitized":250},{"from":272,"to":274,"sanitized":250},{"summary":360,"deductions":361},"The \"azayem-bots-tracker\" plugin v1.0.0 exhibits a generally good security posture based on the provided static analysis. The absence of any known CVEs in its history and the low occurrence of security-sensitive code signals are positive indicators. The plugin demonstrates strong adherence to secure coding practices, with a high percentage of SQL queries using prepared statements and output being properly escaped. Nonce and capability checks are also present, albeit in limited numbers. The limited attack surface, with no unprotected AJAX handlers, REST API routes, or shortcodes, further contributes to its apparent safety. However, the presence of a cron event represents a potential, albeit currently unrealized, entry point for code execution. While no vulnerabilities were identified in the static analysis or taint flows, this may be partly attributable to the plugin's simplicity and lack of complex features. 
It's important to note that this assessment is based solely on the provided data, and a more comprehensive review would involve dynamic analysis and code auditing.",[362,364,366],{"reason":363,"points":176},"Cron event exists, potential entry point",{"reason":365,"points":120},"Limited number of nonce\u002Fcapability checks",{"reason":367,"points":91},"One file operation found","2026-03-16T23:55:47.976Z",{"wat":370,"direct":379},{"assetPaths":371,"generatorPatterns":374,"scriptPaths":375,"versionParams":376},[372,373],"\u002Fwp-content\u002Fplugins\u002Fazayem-bots-tracker\u002Fincludes\u002Fassets\u002Fcss\u002Fstyle.css","\u002Fwp-content\u002Fplugins\u002Fazayem-bots-tracker\u002Fincludes\u002Fassets\u002Fjs\u002Fscript.js",[],[373],[377,378],"azayem-bots-tracker\u002Fincludes\u002Fassets\u002Fcss\u002Fstyle.css?ver=","azayem-bots-tracker\u002Fincludes\u002Fassets\u002Fjs\u002Fscript.js?ver=",{"cssClasses":380,"htmlComments":381,"htmlAttributes":382,"restEndpoints":383,"jsGlobals":384,"shortcodeOutput":386},[],[],[],[],[385],"bt_bots_tracker_admin_vars",[]]