MNIST libtorch practical exercise
Preparation
First, download the MNIST database from http://yann.lecun.com/exdb/mnist/
After downloading, do not decompress the archives with software such as WinRAR, which renames the files (for example, t10k-images-idx3-ubyte becomes t10k-images.idx3-ubyte). It is best to decompress them with tar in a Linux environment.
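As a cross-platform alternative, here is a minimal sketch (my own, assuming the four .gz archives sit in the current directory) that decompresses them with Python's standard-library gzip module without renaming anything:

```python
import gzip
import shutil
from pathlib import Path

# Decompress the MNIST archives, keeping the original base names
# (e.g. t10k-images-idx3-ubyte.gz -> t10k-images-idx3-ubyte).
for gz_path in Path(".").glob("*-ubyte.gz"):
    out_path = gz_path.with_suffix("")  # strip the trailing .gz
    with gzip.open(gz_path, "rb") as src, open(out_path, "wb") as dst:
        shutil.copyfileobj(src, dst)
    print(f"extracted {gz_path} -> {out_path}")
```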
Suppose you unzip t ...
Posted by chrys on Wed, 25 May 2022 15:51:55 +0300
Running PyTorch code on GPU - neural network programming guide
In this episode, we will learn how to use the GPU with PyTorch. We will see the general methods for working with the GPU and how to apply these techniques to train our neural networks.
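To give a feel for these general methods (a minimal sketch of my own, not the article's code), moving a model and a batch of data onto the GPU looks like this:

```python
import torch
import torch.nn as nn

# Pick the GPU if one is available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A tiny network used only to demonstrate device placement.
model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10)).to(device)

# Inputs must live on the same device as the model.
x = torch.randn(32, 784, device=device)
logits = model(x)          # runs on the GPU when available
print(logits.device)       # e.g. cuda:0 or cpu
```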
Deep learning using GPU
If you haven't seen the episode about why deep learning and neur ...
This article shares how to quickly run interesting models from Hugging Face locally through Docker, with less code and less time than the original projects require.
If you are familiar with Python, most model projects can be deployed and run locally in about 10 minutes.
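The article's own setup uses Docker; purely to illustrate how little Python code a Hugging Face model needs, here is a minimal sketch using the transformers pipeline API (the checkpoint name is an illustrative assumption, not the model the article deploys):

```python
# pip install transformers torch
from transformers import pipeline

# Load a sentiment-analysis pipeline; the checkpoint below is just an example choice.
classifier = pipeline("sentiment-analysis",
                      model="distilbert-base-uncased-finetuned-sst-2-english")

print(classifier("Running Hugging Face models locally is surprisingly easy."))
# [{'label': 'POSITIVE', 'score': ...}]
```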
Preface
To make the demonstration easier to follow, I chos ...
From the PyTorch 1.4 tutorial
Outline
Tensor
torch.autograd.backward
If the result node is a scalar
If the result node is a vector
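The two outline cases can be illustrated with a short sketch (my own example, not the tutorial's code): a scalar result can call backward() directly, while a vector result needs a gradient argument of the same shape:

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

# Case 1: the result node is a scalar -> backward() needs no arguments.
y = (x ** 2).sum()
y.backward()
print(x.grad)            # tensor([2., 4., 6.])

# Case 2: the result node is a vector -> pass a gradient of the same shape,
# which is combined with the Jacobian (a vector-Jacobian product).
x.grad = None            # clear the accumulated gradient
z = x ** 2
z.backward(gradient=torch.ones_like(z))
print(x.grad)            # tensor([2., 4., 6.])
```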
[PyTorch] Note 02: Autograd automatic differentiation
In PyTorch, the core of all neural networks is the autograd package
1 Tensor
torch.Tensor is the core class of the autograd package
A Tensor usually records the following ...
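For reference, the autograd-related attributes such a Tensor typically records can be seen in a small sketch (my own illustration, not the note's code):

```python
import torch

a = torch.ones(2, 2, requires_grad=True)   # leaf tensor tracked by autograd
b = (a * 3).mean()                         # result of an operation

print(a.requires_grad)   # True  - gradients will be accumulated for a
print(a.is_leaf)         # True  - created by the user, not by an operation
print(b.grad_fn)         # <MeanBackward0 ...> - the function that produced b

b.backward()
print(a.grad)            # tensor([[0.7500, 0.7500], [0.7500, 0.7500]])
```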
Posted by rusbb on Thu, 19 May 2022 21:17:09 +0300
Overview of the data processing toolbox
PyTorch data processing (data loading, data preprocessing, data augmentation, etc.) relies mainly on the toolkits below, related as follows:
Overview of the PyTorch data processing toolkits
torch.utils.data Toolkit
1) Dataset: an abstract class. Other datasets should inherit from this class and contain ...
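A minimal sketch of such a subclass (my own example): inherit from Dataset and implement __getitem__ and __len__, then hand the result to a DataLoader:

```python
import torch
from torch.utils.data import Dataset, DataLoader

class MyDataset(Dataset):
    """A toy dataset wrapping in-memory tensors of features and labels."""

    def __init__(self, features, labels):
        self.features = features
        self.labels = labels

    def __len__(self):
        return len(self.features)

    def __getitem__(self, idx):
        return self.features[idx], self.labels[idx]

# Usage: wrap the dataset in a DataLoader for batching and shuffling.
ds = MyDataset(torch.randn(100, 8), torch.randint(0, 2, (100,)))
loader = DataLoader(ds, batch_size=16, shuffle=True)
for x, y in loader:
    print(x.shape, y.shape)   # torch.Size([16, 8]) torch.Size([16])
    break
```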
1. Foreword
Introduction to ECA-Net (CVPR 2020):
Paper title: ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks
Paper address: https://arxiv.org/abs/1910.03151
Open-source code: https://github.com/BangguWu/ECANet
As a lightweight attention mechanism, ECA-Net is essentially an implementation of the channel attention mecha ...
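A simplified sketch of the idea (based on my reading of the paper, not the official ECANet code; the kernel size is fixed here rather than derived adaptively from the channel count):

```python
import torch
import torch.nn as nn

class ECALayer(nn.Module):
    """Efficient Channel Attention: a 1D conv over pooled channel descriptors."""

    def __init__(self, k_size=3):
        super().__init__()
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        # 1D convolution across channels captures local cross-channel interaction.
        self.conv = nn.Conv1d(1, 1, kernel_size=k_size,
                              padding=(k_size - 1) // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):                                 # x: (N, C, H, W)
        y = self.avg_pool(x)                              # (N, C, 1, 1) descriptor
        y = self.conv(y.squeeze(-1).transpose(1, 2))      # (N, 1, C): conv over C
        y = y.transpose(1, 2).unsqueeze(-1)               # back to (N, C, 1, 1)
        return x * self.sigmoid(y)                        # rescale channels

feat = torch.randn(2, 64, 32, 32)
print(ECALayer()(feat).shape)                             # torch.Size([2, 64, 32, 32])
```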
Posted by phpflixnewbie on Sun, 15 May 2022 01:47:39 +0300
Recently, I have been working on line-level handwritten document detection, merging the CASIA-HWDB2.x (offline) data and generating a page-level dataset with corresponding bounding boxes. If you want to discuss OCR-related work, you can join the group (details at the end of the article):
CASIA-HWDB2.x (offline) dataset download address: http://www.nlpr.ia.ac.cn/databases/ha ...
Posted by nwoeddie23 on Sat, 14 May 2022 10:42:27 +0300
Note: "batch" here refers to a mini-batch
Two methods for batching sequences (text, logs)
Fixed-length batches (uniform-length batches): all sequences in a batch have the same length. For example, with seqs = [[1,2,3,3,4,5,6,7], [1,2,3], [2,4,1,2,3], [1,2,4,1]] and batch_size = 2, the maximum sequence length is 8; any sequence shorter than 8 is padded ...
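A small sketch of the fixed-length approach on the example above (my own illustration, padding with 0 via torch's pad_sequence helper):

```python
import torch
from torch.nn.utils.rnn import pad_sequence

seqs = [[1, 2, 3, 3, 4, 5, 6, 7], [1, 2, 3], [2, 4, 1, 2, 3], [1, 2, 4, 1]]

# Pad every sequence to the length of the longest one (8), using 0 as the pad value.
padded = pad_sequence([torch.tensor(s) for s in seqs],
                      batch_first=True, padding_value=0)
print(padded.shape)    # torch.Size([4, 8])
print(padded[1])       # tensor([1, 2, 3, 0, 0, 0, 0, 0])

# Split into mini-batches of size 2.
batches = padded.split(2, dim=0)
print([b.shape for b in batches])   # [torch.Size([2, 8]), torch.Size([2, 8])]
```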
Posted by sonic_2k_uk on Sat, 14 May 2022 05:21:32 +0300
thumbnail: https://image.zhangxiann.com/...
toc: true
date: 2020/2/5 20:39:20
disqusId: zhangxian
categories:
PyTorch
tags:
AI
Deep Learning
Code for this chapter:
https://github.com/zhangxiann/PyTorch_Practice/blob/master/lesson1/tensor_introduce1.py
Tensor c ...
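The linked file covers basic Tensor creation; a few representative calls (a quick sketch of my own, not the repository's code):

```python
import torch
import numpy as np

# Directly from Python data
t1 = torch.tensor([[1., 2.], [3., 4.]])

# From a NumPy array (shares memory with the array)
arr = np.ones((2, 3))
t2 = torch.from_numpy(arr)

# From a size specification
t3 = torch.zeros(3, 4)
t4 = torch.rand(2, 2)

print(t1.dtype, t2.dtype, t3.shape, t4.shape)
```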
Posted by sean14592 on Wed, 11 May 2022 06:42:57 +0300
Overview of this article: a reproduction of the BERT-NER-pytorch project from a knowledge graph (KG) open-source project collection, plus some learning notes written afterwards; since I am a newcomer myself, they should be a useful reference for other beginners.
Data: for an introduction to the Transformer inside the BERT model, the animations by Jay Alammar are a must-share; why didn't I see such a good ...
Posted by MartiniMan on Sun, 08 May 2022 05:09:16 +0300