Build a Data-Driven Web App with Flask

Background

  1. There's lots of cool code and many blog posts out there... but no way to try them unless you run the code yourself
  2. The need to create a prototype/minimum viable product (MVP) that even non-technical users can use

  3. $ git clone ...

    $ jupyter notebook 
From this: (screenshot: Index)
To this: (screenshot: Details)

Why Python?


  1. Powerful data analytics libraries (pandas, numpy, scikit-learn...)
  2. ... and web development frameworks (Flask, Django...)
  3. Great community support
More time to focus on getting insights from your data and putting it out there

Flask

A microframework for Python

Great intro tutorials to Flask:

  1. Flask
  2. Flask Mega-Tutorial

... but why Flask?

Flask is simple and flexible:

Example
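The code pane for this slide was lost; a minimal sketch of what makes Flask simple — a complete app in a few lines:

```python
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    # One decorator maps a URL to a function; that's the whole app
    return "Hello, PyCon ID!"

# Exercise the route without starting a server
client = app.test_client()
response = client.get("/")
```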

However, it's always a good idea to assess what your application needs.

Feel free to follow along!

http://pyconid-demo.herokuapp.com

https://github.com/galuhsahid/pyconid2017


Based on "Estimating the Collected Funding Amount of the Social Project Campaigns in a Crowdfunding Platform"
Galuh Tunggadewi Sahid, Ivana Putri, Intan Sari Septiana, Rahmad Mahendra. ICACSIS 2017

Let's get started!

  • Input data

    From user? External data?
  • Prediction

    Done using the model we have built
  • Output

    Display the result of our calculation and other stats
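The three steps above can be sketched end to end; the function names follow the talk's data.py / predict.py / graph.py split, but the bodies here are illustrative stubs:

```python
def get_data(url):
    # Input step: in the real app this scrapes the campaign page
    return {"title_wc": 5, "fb_share_count": 120}

def predict(data):
    # Prediction step: stand-in for the trained model
    return 1000 * data["fb_share_count"]

def render_output(data, prediction):
    # Output step: stand-in for rendering the Details page
    return f"Predicted: {prediction}"

data = get_data("https://example.org/campaign")
result = render_output(data, predict(data))
```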

Tip #1: Think from the perspective of the users. What do they need?

/resources

Your model and other files needed to process your data

/static

Where the stylesheet, JS files, and images go

/templates

Where the HTML files go

Input: data.py

Get data from campaign page and Facebook

Prediction: predict.py

Predict the amount of campaign donation based on scraped campaign data and Facebook data

Output: graph.py

Display a graph that explains how fundraisers can increase their predicted collected donation

Combining everything together: routes!

(screenshots: Index and Details pages)

Getting user input

templates/index.html
routes.py
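The index.html/routes.py panes were lost; a self-contained sketch of the pattern — a form in the template posting to a route (the field name `campaign_url` is illustrative, not necessarily the talk's exact one; `render_template_string` stands in for a template file):

```python
from flask import Flask, request, render_template_string

app = Flask(__name__)

# Stand-in for templates/index.html: a form posting the campaign URL
INDEX_TEMPLATE = """
<form method="post" action="/details">
  <input type="text" name="campaign_url">
  <input type="submit" value="Predict">
</form>
"""

@app.route("/")
def index():
    return render_template_string(INDEX_TEMPLATE)

@app.route("/details", methods=["POST"])
def details():
    # Submitted form fields arrive in request.form
    campaign_url = request.form["campaign_url"]
    return f"Predicting for {campaign_url}"

client = app.test_client()
resp = client.post("/details", data={"campaign_url": "https://example.org/c/1"})
```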

Data Collection

Data from the campaign page

Web scraping using Beautiful Soup 4

Engagement data from Facebook

Facebook Graph API

routes.py
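The scraping pane was lost; a Beautiful Soup 4 sketch of the idea. In the real app the HTML would come from fetching the campaign page (e.g. with `requests`); a static snippet and made-up class names keep this self-contained:

```python
from bs4 import BeautifulSoup

# Stand-in for the fetched campaign page
html = """
<div class="campaign">
  <h1 class="title">Build a School Library</h1>
  <span class="collected">Rp 12.500.000</span>
</div>
"""

soup = BeautifulSoup(html, "html.parser")
# Pick out fields by tag and class, then strip surrounding whitespace
title = soup.find("h1", class_="title").get_text(strip=True)
collected = soup.find("span", class_="collected").get_text(strip=True)
```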

Tip #2: Keep in mind the type of your data
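One concrete reason this tip matters: everything in `request.form` arrives as a string, so cast before doing arithmetic. Field names below are illustrative:

```python
# What a submitted form looks like to Flask: all values are strings
form = {"target_amt": "5000000", "duration_days": "30"}

# Cast to numeric types before any math
target_amt = float(form["target_amt"])
duration_days = int(form["duration_days"])
daily_target = target_amt / duration_days
```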

Prediction

Libraries

We are using scikit-learn, but depending on your app you can use plenty of others (LibSVM, Gensim, TensorFlow...)

Make your model persistent

Pickle, joblib*
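The persistence pane was lost; a pickle sketch of the pattern (a plain dict stands in for a fitted scikit-learn estimator to keep this dependency-free — joblib works the same way and is usually preferred for large numpy arrays):

```python
import pickle
from tempfile import NamedTemporaryFile

# Stand-in for a trained model object
model = {"coef": [0.4, 1.2], "intercept": 3.0}

# Persist once, after training
with NamedTemporaryFile(suffix=".pkl", delete=False) as f:
    pickle.dump(model, f)
    path = f.name

# Load once at app startup (e.g. into /resources), reuse on every request
with open(path, "rb") as f:
    loaded = pickle.load(f)
```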

routes.py
templates/index.html
predict.py
templates/index.html

Data Viz

Libraries

Matplotlib, Seaborn

How do we generate a plot dynamically and display it in a page?
routes.py
graph.py
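The graph.py pane was lost; one common answer to the question above — render the Matplotlib figure into an in-memory buffer and base64-encode it, so no image file ever touches disk. A sketch (axis labels and bin values are illustrative):

```python
import base64
import io

import matplotlib
matplotlib.use("Agg")  # render without a display server
import matplotlib.pyplot as plt

def build_plot_url(bins, amounts):
    fig, ax = plt.subplots()
    ax.bar(range(len(bins)), amounts, tick_label=bins)
    ax.set_xlabel("Facebook shares (binned)")
    ax.set_ylabel("Avg collected amount")

    # Save the PNG into a buffer instead of a file
    buf = io.BytesIO()
    fig.savefig(buf, format="png")
    plt.close(fig)
    buf.seek(0)
    # Base64-encode so the bytes can travel inside an HTML page
    return base64.b64encode(buf.read()).decode("ascii")

plot_url = build_plot_url(["0-10", "10-50"], [1000000, 4500000])
```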

Recap ‐ what do we have so far?

from get_data()
data =
{
  "title_wc": title_wc,
  "short_wc": short_wc,
  "story_wc": story_wc,
  ...
  "prediction": prediction,
  ...
  "fb_reaction_count": fb_reaction_count,
  "fb_comment_count": fb_comment_count,
  "fb_share_count": fb_share_count,
  "collected_amt": collected_amt
}

Recap ‐ what do we have so far?

from display_fb_shares()
  fb_shares =
  {
    "plot_url": plot_url,
    "target_bin": target_bin, 
    "target_bin_avg": target_bin_avg
  }

Display it to the user

routes.py
graph.py

Display it to the user

Most of the time, everything is pretty straightforward
templates/details.html
routes.py

Display it to the user

Displaying graph

RFC 2397 ("data" URIs)

templates/details.html
routes.py
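The template/route panes were lost; the RFC 2397 trick is to embed the base64 PNG directly in an `<img>` tag as a data URI. A sketch, with `render_template_string` standing in for templates/details.html and a truncated stand-in for the real base64 string:

```python
from flask import Flask, render_template_string

app = Flask(__name__)

# Stand-in for templates/details.html: the browser decodes the
# data URI itself, so no plot file is ever served from disk
DETAILS_TEMPLATE = '<img src="data:image/png;base64,{{ plot_url }}">'

@app.route("/details")
def details():
    # plot_url would come from graph.py; a dummy value here
    return render_template_string(DETAILS_TEMPLATE, plot_url="iVBORw0KG")

client = app.test_client()
resp = client.get("/details")
```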

Display it to the user

Conditional
templates/details.html
routes.py
static/style.css
routes.py
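The conditional panes were lost; a sketch of the pattern — a Jinja `{% if %}` in the template switches the CSS class (class names are illustrative; the actual styling would live in static/style.css):

```python
from flask import Flask, render_template_string

app = Flask(__name__)

# Stand-in for templates/details.html
DETAILS_TEMPLATE = """
{% if prediction >= target %}
<p class="success">Likely to reach the target!</p>
{% else %}
<p class="warning">Below target: consider more shares.</p>
{% endif %}
"""

@app.route("/details")
def details():
    # Values would come from predict.py; dummies here
    return render_template_string(DETAILS_TEMPLATE,
                                  prediction=8000000, target=5000000)

client = app.test_client()
text = client.get("/details").get_data(as_text=True)
```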

Display it to the user

Conditional
routes.py

Display it to the user

Custom filters
util/filters.py
routes.py
templates/details.html
routes.py
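The util/filters.py pane was lost; a sketch of a custom Jinja filter — here a hypothetical `rupiah` filter for formatting amounts (the real filters.py may differ):

```python
from flask import Flask, render_template_string

app = Flask(__name__)

# Register a custom filter usable in any template as {{ amt | rupiah }}
@app.template_filter("rupiah")
def rupiah(amount):
    # 2500000 -> "Rp 2.500.000" (Indonesian thousands separator)
    return "Rp " + f"{int(amount):,}".replace(",", ".")

@app.route("/")
def index():
    # Stand-in for templates/details.html
    return render_template_string("{{ amt | rupiah }}", amt=2500000)

client = app.test_client()
text = client.get("/").get_data(as_text=True)
```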

Flask is capable of much more

That's it. 🎉 Thanks!