- Create an IAM user
- Create, Read, Update, Delete a Users table
- Add test coverage for our new Dynamo and Users helper
- Add the project to git
In the last article we talked a little about AWS, and hopefully you had a chance to set up a free account so we can take advantage of its Python SDK and highly scalable services. If you enjoy this series, please share this article!
Create an IAM user
Before we can use DynamoDB, AWS's highly scalable NoSQL database, our application needs a user with permission to access the cloud services it relies on.
First, log in to AWS, find the IAM service, and create a user with any username you want. AWS also suggests disabling root access to your account and provisioning services through users like the one you are about to create. For a free-tier account this isn't critical, but if you are running IT for an organization it is very important to consider your access architecture.
Checking programmatic access will generate an access key and secret key pair, which our web application will use to access the services our demouser has permission to use.
Clicking AWS Management Console access will let our user sign in to a dedicated instance of the AWS console based on the root account's configuration. This console link should be saved somewhere, or emailed to whoever you generate accounts for.
After clicking next we will be brought to the section where we assign permissions. For the scope of this project we will give this user full unrestricted access to DynamoDB, since that is the main service we will be using for the tutorial. It's important to restrict access this way because if the keys or login are compromised, an attacker will only see our tables and won't be able to spin up other services on our account. In software it's always important to consider what happens if keys get lost, data gets intercepted, or a user does something that breaks the application.
After the review step, we will be brought to the completion page, which shows the access key and secret key. It is very important to copy these somewhere, preferably into a hidden .aws folder in our home directory, in a file named credentials. This lets the AWS Python SDK automatically load the key pair from that file in our development environment.
~/.aws/credentials
[default]
aws_access_key_id = AKIA**********
aws_secret_access_key = dEBST*******************************************
CRUD (Create, Read, Update, Delete) from the Users table
Now that we have an IAM user that can access Dynamo, we can start building our Users table and making our web application a little more interactive and personal. To structure the code, we will make a controllers folder under the app folder for all the code our backend will talk to directly, then create __init__.py and users.py files in that folder. If you're new to the series, get a snapshot of the code with git:
git clone git@github.com:stefbrad15/PythonFlask.git
cd PythonFlask
git checkout aa8c7dd62bd784d267970e42368f9fdc64ab0a00
If you're unfamiliar with git, I talk about it at a high level at the bottom of the article.
CompSci tip
The controllers name comes from the MVC design pattern, in which a GUI application is broken into three main layers: Model, View, Controller. The Model is the data structure that holds the data, along with helpers and services for the business logic; the View holds the UI pieces (in our Flask app, the View is the front end, which has its own MVC pattern associated with it); and the Controller is a thin layer linking the View and the Models.
Python tip
Adding an __init__.py file to a folder makes Python interpret that folder as its own module. Putting imports in this file exposes those functions as part of the module, accessible from folders higher up or at the same level. These modules can even be broken out further and published to the main Python package repository so they can be installed via "pip install your-package". I haven't actually done that, but I know it can be done.
Anyway, with our controllers/__init__.py and controllers/users.py we can expose functions, or the entire submodule, from users.py through __init__.py and use them in other parts of our app by adding "from controllers import function" wherever we need it. For our immediate needs we should add "from app.controllers import users" to app/__init__.py; this makes the routes defined in our users controller load when we run our Flask object. We will also add "import users" to controllers/__init__.py so that reference resolves.
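The wiring described above boils down to a couple of one-line additions; a sketch, assuming the file layout used so far in the series:

```python
# app/__init__.py
from flask import Flask
app = Flask(__name__)

# Importing the controllers module registers its routes on the Flask object.
from app.controllers import users

# app/controllers/__init__.py
# Expose the users submodule so "from app.controllers import users" works.
import users  # Python 2 style implicit relative import, as in the article
```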
We should also create a "helpers module" in our app for miscellaneous business logic, so the app can follow good software design principles such as SOLID and DRY: we "don't repeat yourself", and we keep each file to a single responsibility that is open to extension and closed to modification once the class is finished. In my experience, since Python is duck typed it's a little harder to follow strict object-oriented principles, but the better we get at it the more sustainable the application will be as it grows.
Below is our users controller, with a route to return a single user and a route to return all users. Since we don't have a UI to create a user yet, we will leave that for the next part, where we build a UI to register and log in a user.
app/controllers/users.py
code here
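The original listing is not reproduced here; below is a minimal, self-contained sketch of the two routes described. An in-memory dict stands in for the Dynamo-backed helpers so the sketch runs on its own; the real controller would call into the app.helpers module instead, and the route and helper names are my assumptions.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory stand-in for the Users table so this sketch runs without AWS.
USERS = {
    'johndoe': {'username': 'johndoe', 'password': 'JohnPass123'},
    'janedoe': {'username': 'janedoe', 'password': 'JanePass123'},
}


@app.route('/api/user')
def get_user():
    # Look a single user up by the username/password query parameters.
    username = request.args.get('username')
    password = request.args.get('password')
    user = USERS.get(username)
    if user is None or user['password'] != password:
        return jsonify({'error': 'user not found'}), 404
    return jsonify(user)


@app.route('/api/users')
def get_all_users():
    # The real version checks for the Users table first and creates
    # and seeds it if missing; here we just return every stored user.
    return jsonify(list(USERS.values()))
```

In the real code the dict lookups become calls into the helpers module, which is where the Dynamo interaction lives.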
The main callouts are the imports from the app.helpers modules, which have yet to be created. These keep our controller functions slim and more readable, and they let us put test coverage on the core logic itself, as you'll see later in this article.
Also, in our get_all_users function you can see we check for the existence of a Users table and create it if it isn't there. We also seed the table with two demo users in our helpers file.
Ideally this would live in a migrations folder with a top-level script for setting up the database per environment, but I didn't set that up before committing the code. We can refactor this out and set up those migrations when we add another table, where we'll get more use out of it and have a chance to think through the design.
Helpers module code:
code here
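The helpers listing is not reproduced here; below is a minimal sketch of the DynamoInterface piece described in the callouts. The class and method names are my assumptions, and the boto imports are done lazily inside the methods so the module loads even in an environment without boto or AWS credentials configured.

```python
class DynamoInterface(object):
    """Thin wrapper around a single DynamoDB table using boto (boto 2)."""

    def __init__(self, table_name, read_throughput=2, write_throughput=2):
        # A throughput of 2 allows two reads/writes per second -- plenty
        # for development, far too small for a busy production site.
        self.table_name = table_name
        self.read_throughput = read_throughput
        self.write_throughput = write_throughput
        self._table = None

    @property
    def table(self):
        # boto loads credentials from ~/.aws/credentials or environment
        # variables; if neither is configured, this call raises.
        if self._table is None:
            from boto.dynamodb2.table import Table
            self._table = Table(self.table_name)
        return self._table

    def create_table(self, hash_key='username'):
        # Create the table with the configured throughput, keyed on the
        # given hash key (username for our Users table).
        from boto.dynamodb2.fields import HashKey
        from boto.dynamodb2.table import Table
        return Table.create(
            self.table_name,
            schema=[HashKey(hash_key)],
            throughput={'read': self.read_throughput,
                        'write': self.write_throughput},
        )
```

To use different key management, you would pass a pre-built connection into the constructor and hand it to Table() via its connection argument.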
Main callouts:
- The DynamoInterface class is automagically loading the AWS credentials from either the ~/.aws/credentials file or from pre-configured environment variables. If neither is there, the code fails in a spectacular light show of doom and strife. If you would like to manage your AWS keys a different way, you will have to inject them into the DynamoInterface constructor, provide them to a connection object, and use that when calling Table().
- The DynamoInterface also uses boto 2; to get it for your project, just "pip install boto" for all your fun AWS needs. Read the docs.
- We are defining a read and write throughput of 2, which allows that many actions against DynamoDB per second. If you have tens of thousands of users on your site this will cause severe bottlenecks in your data access, but in a development world it is plenty. AWS charges based on the RCU/WCU you assign to a table and how much data you store in it, so keeping this small and increasing it as your application scales helps keep costs manageable if you are not on a free-tier account. Ideally the monetization of your app should more than cover this cost.
- The encrypt_password method is extremely important: passwords should never be kept in a database as plaintext, since users often share passwords across sites. They should also only ever be run through a one-way hash, never stored reversibly. Adding a "salt" to a password, a random string that lengthens it, makes it harder for malicious users to crack the hash and break into someone's account.
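To make the salting point above concrete, here is a small illustration using Python's standard hashlib; this is my own example of the technique, not the article's exact helper.

```python
import hashlib
import os


def hash_password(password, salt=None):
    # Generate a random 16-byte salt if one was not supplied.
    if salt is None:
        salt = os.urandom(16)
    # PBKDF2 applies SHA-256 many times, making brute force expensive.
    digest = hashlib.pbkdf2_hmac('sha256', password.encode('utf-8'),
                                 salt, 100000)
    return salt, digest.hex()


def verify_password(password, salt, expected_hex):
    # Re-hash with the stored salt and compare; the hash is never reversed.
    _, digest_hex = hash_password(password, salt)
    return digest_hex == expected_hex
```

The database would store only the salt and the hex digest; login re-hashes the submitted password with the stored salt and compares.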
- Lastly, we are doing our own user management as a teaching and learning exercise. There are services out there, such as OAuth providers, that let you handle authorization through Facebook, Google, etc. logins, but you will still want to store site-specific user data in your own table, which can optionally handle your own logins as well. Going the OAuth route will likely increase your user base, since users won't have to spend time registering to interact with your site.
With those two modules in place, we should be able to run the Flask application from the top level of our project with "python run.py", then go to 127.0.0.1:5000/api/users to create our table and return our two demo users, which can then be rendered in the UI as a pretty list. We can also test our get-user route, http://127.0.0.1:5000/api/user?username=johndoe&password=JohnPass123, to see that it returns the data specific to John. This will be important for logging a user in to the site and creating a session. In the next article we will refactor this into a POST route and store an active session key in our db.
You can also log into your AWS account, go to DynamoDB, and see that the table and items have been created there. This remote UI access to your tables is another nice advantage of Dynamo, as it makes them much easier to manage.
Add test coverage for our new Dynamo and Users helper
Testing these two routes in the UI is all fun and games for now, to see that our application is actually doing something and interacting with AWS, but over the lifecycle of the application it becomes tedious and a time sink to test all our functionality with every little change. Also, some functionality, such as creating and deleting users, shouldn't be tested from GET routes, but we still want to know it works. For this we will bring in pytest; the upfront time spent writing tests will save us huge amounts of time down the road and serve as a resource for new developers on the code base to learn the behavior of the application.
If you don't already have pytest installed, you can do so with "pip install pytest". When pytest is run, it recursively walks the directory looking for files that match the "test_*.py" or "*_test.py" naming conventions and runs them, checking assert statements to pass or fail each test. It also outputs any logging statements for failed tests to help trace what might have gone wrong. Other modules can be paired with pytest if you want to get hardcore about your development operations, for example outputting HTML pages documenting your test coverage or running your tests on every modification to the code. Very cool stuff as applications grow and multiple devs join the project, but for now it's a bit overkill.
Anyway, you will see the tests I set up for our Users table below.
code here
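The test listing is not reproduced here; below is a sketch of the fixture pattern described, using an in-memory fake table so the shape of the tests is clear without touching AWS. The class and method names are my assumptions about the real test file.

```python
import pytest


class FakeUsersTable(object):
    """In-memory stand-in for the Dynamo Users table (an assumption)."""

    def __init__(self):
        self._items = {}

    def put_item(self, data):
        self._items[data['username']] = data
        return True

    def get_item(self, username):
        return self._items.get(username)

    def delete_item(self, username):
        return self._items.pop(username, None) is not None


@pytest.fixture
def users_table():
    # Build a freshly seeded table for each test so they start clean.
    table = FakeUsersTable()
    table.put_item({'username': 'johndoe', 'password': 'JohnPass123'})
    return table


def test_get_item(users_table):
    # The seeded demo user should be retrievable by username.
    assert users_table.get_item('johndoe')['password'] == 'JohnPass123'


def test_create_and_delete_user(users_table):
    # Round-trip a new user: create, read back, delete, confirm gone.
    users_table.put_item({'username': 'janedoe', 'password': 'JanePass123'})
    assert users_table.get_item('janedoe') is not None
    assert users_table.delete_item('janedoe')
    assert users_table.get_item('janedoe') is None
```

pytest injects the users_table fixture into any test function that names it as a parameter, which is what keeps the per-test setup out of the test bodies.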
Creating the fixture object lets us inject it into test functions easily, so we don't have to set it up every time. We could also break out the table.get_item code into another fixture should we add more tests later. This keeps our tests readable, self-documenting, and DRY. Running pytest should now give a test report with, hopefully, 6 passing tests.
Add the project to git
To help share the project, I have posted it on GitHub and will be adding tags to keep everything up to date. A tag in git is a human-readable link to a commit, which can also be associated with releases.
You can get a copy of this release here.
If you are unfamiliar with git, I highly recommend spending a day or two studying up on it, creating a repo, and committing/reverting changes. It is a highly effective tool, constantly updated and maintained by the open-source community, and it allows for easy collaboration and sharing, such as the release link above.
High level rundown of git
- Create an account on github
- download git in your development environment
- Linux: sudo apt-get install git
- Windows: https://desktop.github.com/ (Note: make sure to install git-bash, it’s pretty cool)
- Configure git
- git config --global user.name "username"
- git config --global user.email "email@place.com"
- Make a repository on github to serve as your remote. Follow the instructions there to push your code to this repository.
Another great reason to use git is that your commit history can serve as a pseudo-portfolio of your coding shenanigans. If you are active in open source or push your side projects, a nice series of commits showing improvement can even be added to your LinkedIn profile, and recruiters eat that stuff up. This could help you get into companies you might not even have known were on your radar.
Closing statements
Thank you for sticking with it this far. This has easily been one of the most detailed blog articles I've written so far, and thinking that it will help some developers out there progress in their careers, or build on their passion for web application development, made it a blast to write. If this has helped in any way, please leave a comment below; I'd love to hear about your progress, how I can improve this series, or even just your plans for your Flask app. Have a great day, until next time.
~Stefan~