Section breakdown:

  1. Import/Write CSV
  2. Import/Write Excel
  3. Import HTML (scraping)

Import/Write CSV

import pandas as pd
df = pd.read_csv('example') #read a CSV file into a DataFrame
df.to_csv('My_output',index=False) #write to CSV; don't include the index column
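As a quick sketch of the full round trip (the data and the file name my_output.csv are just examples):

```python
import pandas as pd

# Hypothetical data, just to illustrate the round trip
df = pd.DataFrame({'a': [1, 2], 'b': [3, 4]})

df.to_csv('my_output.csv', index=False)  # write without the index column
df2 = pd.read_csv('my_output.csv')       # read it back

print(df2.equals(df))  # → True: the round trip preserves the data
```

Passing index=False matters here: without it, the index would be written as an extra unnamed column and show up as Unnamed: 0 when you read the file back.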

Import/Write Excel

import pandas as pd
df = pd.read_excel('Excel_Sample.xlsx',sheet_name='Sheet1') #read an Excel sheet into a DataFrame
df.to_excel('Excel_Sample.xlsx',sheet_name='Sheet1') #write the DataFrame back to Excel
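If you need more than one sheet in the same workbook, pandas provides ExcelWriter. A minimal sketch (the file and sheet names here are made up for illustration, and writing .xlsx assumes an engine like openpyxl is installed):

```python
import pandas as pd

# Hypothetical DataFrames; file and sheet names are just examples
df1 = pd.DataFrame({'x': [1, 2]})
df2 = pd.DataFrame({'y': [3, 4]})

# ExcelWriter lets you put several sheets in one workbook
with pd.ExcelWriter('Multi_Sample.xlsx') as writer:
    df1.to_excel(writer, sheet_name='First', index=False)
    df2.to_excel(writer, sheet_name='Second', index=False)

# read_excel can then target a sheet by name
back = pd.read_excel('Multi_Sample.xlsx', sheet_name='Second')
print(back['y'].tolist())  # [3, 4]
```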

Import HTML

Required libraries (assuming Anaconda is installed):

conda install lxml
conda install html5lib
conda install beautifulsoup4

In this case I’m going to reference a table found on a website:

import pandas as pd
data = pd.read_html('') #pass the page URL; returns a list of DataFrames

Notice that this reads in every table it can find on the page and returns them as a list of DataFrames. You can explore the tables it picked up by viewing data[0], data[1], etc.
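To see what read_html returns without hitting a live site, you can feed it an HTML string wrapped in StringIO (the table below is a toy example standing in for a real page; this assumes lxml or html5lib + beautifulsoup4 is installed, as above):

```python
import pandas as pd
from io import StringIO

# A toy HTML table standing in for a real web page
html = """
<table>
  <tr><th>name</th><th>score</th></tr>
  <tr><td>alice</td><td>10</td></tr>
  <tr><td>bob</td><td>20</td></tr>
</table>
"""

data = pd.read_html(StringIO(html))  # one DataFrame per <table> found
print(len(data))                  # 1
print(data[0]['score'].tolist())  # [10, 20]
```

The <th> cells become the column headers, so data[0] comes back with columns name and score already labeled.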