1. Data import and export
(1) Data import and export of a CSV file
# Import 1.csv data into the data variable
import pandas
data = pandas.read_csv(
    '1.csv',          # file path
    engine='python',  # set the engine parameter so a path containing Chinese characters will not raise an error
    encoding='utf-8'  # set encoding format (utf-8 here; adjust to match the file)
)
# Data output
print(data)
# de ...
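The excerpt covers only the import side of the heading; a minimal sketch of the export direction with `DataFrame.to_csv` (the sample data is invented for illustration):

```python
import io

import pandas as pd

# A small DataFrame standing in for the imported data
data = pd.DataFrame({'name': ['Alice', 'Bob'], 'score': [90, 85]})

# Export: to_csv() with no path returns the CSV text directly;
# pass a file path instead to write to disk
csv_text = data.to_csv(index=False)
print(csv_text)

# Round trip: reading the text back yields an equal DataFrame
restored = pd.read_csv(io.StringIO(csv_text))
```

`index=False` drops the row index from the output, which is usually what you want when the index carries no information.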
Posted by EvilWalrus on Mon, 23 May 2022 07:22:19 +0300
Burpy is a plug-in that bridges BurpSuite and Python. From now on, you can use Python to process HTTP packets however you like!
It executes the specified Python script and returns the processing result to BurpSuite.
Features and UI introduction
One note here: it uses Python 2.
Specify your own Python script in Burpy PY f ...
Posted by nielskg on Mon, 23 May 2022 05:57:28 +0300
Python Pandas is popular for its basic functionality. The pandas library provides many essential basic functions that make your daily work easier. Beginners are strongly recommended to master the basic functions of Pandas.
Basic functions of pandas
Before starting with Pandas' basic functionality, you must first import the library. For ...
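A few of the everyday basic functions the excerpt refers to, sketched on a toy DataFrame (the column names and values here are invented for illustration):

```python
import pandas as pd

df = pd.DataFrame({'city': ['Beijing', 'Shanghai', 'Guangzhou'],
                   'aqi': [120, 85, 60]})

print(df.head(2))     # first two rows
print(df.shape)       # (rows, columns) tuple
print(df.dtypes)      # data type of each column
print(df.describe())  # summary statistics for numeric columns
```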
1, TCP basic syntax
The server communicates with the client through a socket. To ensure data integrity, the TCP protocol performs a three-way handshake, and the server can only communicate with one client at a time.
Writing the server side:
import socket                                               # Import socket module
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)  # Create socket object
server.bind(('127.0.0.1', 8080))                            # Bind server IP and port (example address)
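Following the commented skeleton above, a self-contained sketch of a one-client TCP echo exchange. Binding to port 0 lets the OS pick a free port so the example runs anywhere; a real server would bind a fixed port and loop over `accept()`:

```python
import socket
import threading

def run_server(server):
    # Accept a single client, echo one message back, then close
    conn, addr = server.accept()
    data = conn.recv(1024)
    conn.sendall(data)
    conn.close()
    server.close()

# Create, bind, and listen (port 0 = let the OS choose a free port)
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(('127.0.0.1', 0))
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=run_server, args=(server,))
t.start()

# Client side: connect, send, receive the echo
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(('127.0.0.1', port))
client.sendall(b'hello')
reply = client.recv(1024)
client.close()
t.join()

print(reply)  # b'hello'
```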
Posted by Kitkat on Mon, 23 May 2022 02:16:35 +0300
Recently I have been learning Python web crawling. Some websites require submitting encrypted data when crawling, so I am recording the process here for my own study and archiving.
1. Target website
The China Air Quality Online Monitoring and Analysis Platform includes PM2.5 and weather data for 367 cities across the country, including AQI, ...
Posted by chris1 on Sun, 22 May 2022 23:56:59 +0300
Scrapy is well-known. The MVs on a certain website looked good, but downloading them manually is too troublesome, so we use Scrapy to grab them.
The basic idea is to study the site's home page, the movie list pages, and the playing pages; obtain the formatted information in each page and the jump relationships from page to page, and finally obta ...
Posted by reyes99 on Sun, 22 May 2022 20:58:28 +0300
A web crawler (also known as a web spider or web robot, and in the FOAF community more often called a web page chaser) is a program or script that automatically crawls web information according to certain rules. Other names that are not often used are ants, automatic indexers, emulators ...
Posted by Blekk on Sun, 22 May 2022 18:28:14 +0300
Getting Started with the Requests Library
Seven main methods
Constructs a request; the basic method underpinning the methods below
The main method for obtaining HTML pages, corresponding to HTTP GET
The method for obtaining the header information of HT ...
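For context, the seven methods are `requests.request()` plus the six HTTP-verb wrappers (`get`, `head`, `post`, `put`, `patch`, `delete`). A small offline sketch, assuming the `requests` package is installed, that builds a GET request without sending anything over the network:

```python
import requests

# requests.request() is the base method; get/head/post/put/patch/delete
# are thin wrappers around it, one per HTTP verb.
req = requests.Request('GET', 'https://example.com', params={'q': 'test'})
prepared = req.prepare()  # build the request without sending it

print(prepared.method)  # GET
print(prepared.url)     # the query string q=test is appended to the URL
```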
Posted by busin3ss on Sun, 22 May 2022 16:59:39 +0300
1, Basic process of logistic regression
1. Logistic regression
Learn a pattern from the feature values of a group of samples and establish a model (the learned model is abstracted as a formula f(x)), then use this model to predict the results for other samples. Logistic regression classifies those predicted results.
Logistic regression steps:
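The steps above can be sketched with a tiny hand-rolled model: one feature, a sigmoid as f(x), stochastic gradient descent on the log-loss, and a 0.5 threshold for classification (the data and learning rate are invented for illustration):

```python
import math

def sigmoid(z):
    # Squash a real number into the probability range (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

# Tiny one-feature training set: label is 1 when x > 2.5
X = [1.0, 2.0, 3.0, 4.0]
y = [0, 0, 1, 1]

w, b = 0.0, 0.0
lr = 0.5
for _ in range(2000):
    for xi, yi in zip(X, y):
        p = sigmoid(w * xi + b)
        # Gradient of the log-loss for a single sample
        w -= lr * (p - yi) * xi
        b -= lr * (p - yi)

def predict(x):
    # Classify by thresholding the predicted probability at 0.5
    return 1 if sigmoid(w * x + b) >= 0.5 else 0
```

In practice you would use a library implementation (e.g. scikit-learn's `LogisticRegression`) rather than hand-rolled SGD, but the loop above is the whole idea: fit f(x), then threshold its output to classify.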
Posted by McMaster on Sun, 22 May 2022 15:13:52 +0300
Regular expressions: a regular expression is a special sequence of characters. It can be understood as a string, but one that is more powerful than an ordinary string. Usage scenarios for regular expressions: when registering and setting a password, disallowing special characters and limiting the password length; when crawling, use regular ...
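The password scenario mentioned above, as a minimal sketch (the 6-12 character length limit is invented for illustration):

```python
import re

# Allow only letters and digits, 6 to 12 characters; anchors ^ and $
# force the whole string to match, so special characters are rejected
pattern = re.compile(r'^[A-Za-z0-9]{6,12}$')

print(bool(pattern.match('abc123')))  # True: letters and digits, valid length
print(bool(pattern.match('abc!23')))  # False: contains a special character
print(bool(pattern.match('a1')))      # False: too short
```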
Posted by nomad9 on Sun, 22 May 2022 11:47:14 +0300