File: ~/lib/python3.10/urllib/__pycache__/robotparser.cpython-310.pyc
"""robotparser.py

Copyright (C) 2000 Bastian Kleineidam

You can choose between two licenses when using this package:
1) GNU GPLv2
2) PSF license for Python 2.2

The robots.txt Exclusion Protocol is implemented as specified in
http://www.robotstxt.org/norobots-rfc.txt
"""

import collections
import urllib.parse
import urllib.request

__all__ = ["RobotFileParser"]

RequestRate = collections.namedtuple("RequestRate", "requests seconds")


class RobotFileParser:
    """This class provides a set of methods to read, parse and answer
    questions about a single robots.txt file.
    """

    def __init__(self, url=''):
        self.entries = []
        self.sitemaps = []
        self.default_entry = None
        self.disallow_all = False
        self.allow_all = False
        self.set_url(url)
        self.last_checked = 0

    def mtime(self):
        """Returns the time the robots.txt file was last fetched.

        This is useful for long-running web spiders that need to
        check for new robots.txt files periodically.
        """
        return self.last_checked

    def modified(self):
        """Sets the time the robots.txt file was last fetched to the
        current time.
        """
        import time
        self.last_checked = time.time()

    def set_url(self, url):
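The class above is the standard-library `urllib.robotparser` implementation. A short usage sketch, assuming a hypothetical agent name `MyBot` and `example.com` URLs; network access is skipped by feeding `parse()` a list of robots.txt lines directly instead of calling `read()`:

```python
import urllib.robotparser

# Build a parser and hand it robots.txt content directly via parse(),
# bypassing read() and any network fetch.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://example.com/robots.txt")
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
    "Allow: /",
])

# can_fetch() answers per-agent questions against the parsed rules;
# the first matching rule line wins.
print(rp.can_fetch("MyBot", "http://example.com/private/page"))  # False
print(rp.can_fetch("MyBot", "http://example.com/index.html"))    # True

# mtime() stays 0 until modified() (or read()) records a fetch time.
rp.modified()
print(rp.mtime() > 0)  # True
```

This mirrors the lifecycle visible in the source: `__init__` resets state and calls `set_url()`, `modified()` stamps `last_checked`, and `mtime()` reports it so long-running spiders can decide when to re-fetch robots.txt.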