Dir : ~/lib64/python3.8/urllib/__pycache__/
File : robotparser.cpython-38.pyc
"""robotparser.py

Copyright (C) 2000 Bastian Kleineidam

You can choose between two licenses when using this package:
1) GNU GPLv2
2) PSF license for Python 2.2

The robots.txt Exclusion Protocol is implemented as specified in
http://www.robotstxt.org/norobots-rfc.txt
"""

import collections
import urllib.parse
import urllib.request

__all__ = ["RobotFileParser"]

RequestRate = collections.namedtuple("RequestRate", "requests seconds")


class RobotFileParser:
    """This class provides a set of methods to read, parse and answer
    questions about a single robots.txt file.
    """

    def __init__(self, url=''):
        self.entries = []
        self.sitemaps = []
        self.default_entry = None
        self.disallow_all = False
        self.allow_all = False
        self.set_url(url)
        self.last_checked = 0

    def mtime(self):
        """Returns the time the robots.txt file was last fetched.

        This is useful for long-running web spiders that need to
        check for new robots.txt files periodically.
        """
        return self.last_checked

    def modified(self):
        """Sets the time the robots.txt file was last fetched to the
        current time.
        """
        import time
        self.last_checked = time.time()

    def set_url(self, url):
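The class above is the standard library's `urllib.robotparser.RobotFileParser`. A short sketch of how it is typically used, feeding rules directly via `parse()` rather than fetching over the network (the robots.txt rules, agent string, and URLs below are illustrative, not from the dump):

```python
import urllib.robotparser

# Build a parser and load rules from a list of robots.txt lines.
rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# can_fetch(useragent, url) answers whether a crawler may fetch the URL.
print(rp.can_fetch("*", "http://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "http://example.com/index.html"))         # True
```

In production, a crawler would instead call `rp.set_url("http://example.com/robots.txt")` followed by `rp.read()`, then periodically recheck `rp.mtime()` to decide when to refresh the rules.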