# LLMs.txt for 微企脉
# Generated by PDFTool.cc LLMs.txt Generator
# Based on the llms.txt.org specification
# Contact: 193525426@qq.com
# Website: https://www.weiqimai.com/

# Allow/Disallow rules for AI/LLM crawlers
User-agent: *
Crawl-delay: 1
Allow: /

# LLM provider specific rules
User-agent: GPTBot
Crawl-delay: 1

User-agent: anthropic-ai
Crawl-delay: 1

User-agent: Google-Extended
Crawl-delay: 1

User-agent: facebookexternalhit
Crawl-delay: 1

User-agent: bingbot
Crawl-delay: 1

User-agent: cohere-ai
Crawl-delay: 1

User-agent: ai21bot
Crawl-delay: 1

User-agent: huggingface
Crawl-delay: 1

User-agent: stability-ai
Crawl-delay: 1

User-agent: CCBot
Crawl-delay: 1

User-agent: Baiduspider
Crawl-delay: 1

User-agent: AlibabaSpider
Crawl-delay: 1

User-agent: TencentSpider
Crawl-delay: 1

User-agent: ByteDanceSpider
Crawl-delay: 1

User-agent: SogouSpider
Crawl-delay: 1

User-agent: 360Spider
Crawl-delay: 1

User-agent: ShenmaSpider
Crawl-delay: 1

User-agent: YisouSpider
Crawl-delay: 1

User-agent: ToutiaoSpider
Crawl-delay: 1

User-agent: DouyinSpider
Crawl-delay: 1

User-agent: WeChatBot
Crawl-delay: 1

User-agent: DingTalkBot
Crawl-delay: 1

User-agent: FeishuBot
Crawl-delay: 1

User-agent: XiaomiSpider
Crawl-delay: 1

User-agent: HuaweiSpider
Crawl-delay: 1

User-agent: OppoSpider
Crawl-delay: 1

User-agent: VivoSpider
Crawl-delay: 1

User-agent: MeituanSpider
Crawl-delay: 1

User-agent: DidiSpider
Crawl-delay: 1

User-agent: JDSpider
Crawl-delay: 1

# Custom rules
# Add any custom rules here. For example:
# User-agent: *
# Disallow: /temp/

# Sitemaps (if available)
# Sitemap: https://www.weiqimai.com/wp-sitemap.xml
# Sitemap: https://www.weiqimai.com/sitemap.xml

# Instructions for AI/LLM providers:
# 1. Please respect the Crawl-delay setting
# 2. Do not crawl disallowed paths
# 3. Contact 193525426@qq.com with questions
# 4. This file follows the llms.txt.org specification