Update requirements/readme and fix code for new libs
Ric Harvey committed Sep 21, 2016
1 parent aa8d3e2 commit 896ee77
Showing 3 changed files with 10 additions and 6 deletions.
10 changes: 6 additions & 4 deletions README.md
@@ -11,13 +11,13 @@ I hope some of you find this useful.
Make sure all these Python modules are installed:

 + BeautifulSoup
-+ urllib2
-+ urlparse
++ urllib3
++ urlparse3

example:

```bash
-sudo pip install BeautifulSoup
+sudo pip install -r requirements.txt
```

### Usage
@@ -30,7 +30,9 @@ To get all documents:

Files that already exist on disk are not re-downloaded (so by default only new sections/files are fetched). To force re-download of files that already exist on disk, use

```bash
./getAWSdocs.py --force
```
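The skip-unless-forced behaviour described above can be sketched as follows. This is a hypothetical illustration, not the repository's actual implementation; the `should_download` helper and the flag wiring are assumptions for the example.

```python
# Hypothetical sketch of the "skip existing files unless --force" logic.
# Not the actual getAWSdocs.py code; names here are illustrative only.
import argparse
import os

def should_download(path, force):
    """Download only if the file is missing, or --force was given."""
    return force or not os.path.exists(path)

parser = argparse.ArgumentParser()
parser.add_argument("--force", action="store_true",
                    help="re-download files that already exist on disk")
args = parser.parse_args([])  # empty argv so the sketch runs standalone

print(should_download("docs/ec2.pdf", args.force))
```

With `--force` the check short-circuits to `True`, so every file is fetched again regardless of what is already on disk.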

That's it!

2 changes: 1 addition & 1 deletion getAWSdocs.py
@@ -9,7 +9,7 @@

# Build a list of the amazon service sections
def get_services():
-    html_page = urllib2.urlopen("http://aws.amazon.com/documentation/")
+    html_page = urllib.urlopen("http://aws.amazon.com/documentation/")
     # Parse the HTML page
     soup = BeautifulSoup(html_page)
     urls = []
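The link-extraction step that `get_services` delegates to BeautifulSoup can be sketched with only the standard library. This is an illustrative stand-in, not the script's real code: it parses a small static snippet instead of fetching `http://aws.amazon.com/documentation/` over the network, and the `LinkCollector` class is an assumption for the example.

```python
# Stdlib sketch of collecting anchor hrefs from an HTML page
# (the real script uses BeautifulSoup on the live AWS docs page).
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.urls = []

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) tuples
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.urls.append(href)

# Static stand-in for the fetched documentation page
html_page = '<a href="/documentation/ec2/">EC2</a><a href="/documentation/s3/">S3</a>'
collector = LinkCollector()
collector.feed(html_page)
print(collector.urls)  # ['/documentation/ec2/', '/documentation/s3/']
```

BeautifulSoup's `find_all("a")` does the same walk with far less ceremony, which is why the script depends on it.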
4 changes: 3 additions & 1 deletion requirements.txt
@@ -1 +1,3 @@
-BeautifulSoup==3.2.1
+BeautifulSoup
+urllib3
+urlparse3
