User:Steven Abouchedid/sandbox

Python
The main convenience of Python is the ability to import modules into a script, giving access to an extremely wide range of functions for any specialization. These modules can be easily installed with pip from the Windows command prompt. Once a module is installed, all that is needed for the new resources to work is an import statement naming the module at the top of any Python file.
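As a minimal sketch of that workflow (the third-party module named here is just an example):

```python
# A third-party module is installed once from the Windows command prompt:
#   pip install requests
# and is then made available with an import statement at the top of the file.
# Standard-library modules, like json below, ship with Python and need
# only the import line:
import json

# Once imported, the module's functions are available anywhere below:
data = json.loads('{"status": "ok"}')
print(data["status"])   # ok
```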

API Calls
Calling APIs is fairly simple with Python. The difficulty increases depending on the API and its security, but relative to other languages Python makes the process of calling APIs very convenient. Using an example API and the following code, an API call was successful. This call was made to a sample GitHub API, and the script prints three different pieces of the response. After running "pip install requests" in the command prompt, this call succeeded on my machine.
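A sketch of such a call, assuming the requests module is installed; the endpoint path and the three printed fields are my guesses at what the original script showed:

```python
import requests

BASE = "https://api.github.com"

def endpoint(path):
    # Pure helper: join the API base URL and a resource path.
    return BASE.rstrip("/") + "/" + path.lstrip("/")

def show(path):
    # Make the GET request and print three parts of the response.
    resp = requests.get(endpoint(path))
    print(resp.status_code)                  # e.g. 200
    print(resp.headers.get("Content-Type"))  # e.g. application/json
    print(resp.text[:200])                   # start of the response body

if __name__ == "__main__":
    show("events")
```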

SQL Database Connection Using PYODBC
To connect to the local MS SQL Server, a module called PYODBC was used, which allows for easy connection to any SQL server, as long as the driver, server name, database, and any other security parameters are provided. In order to connect to my local "Test" database, the following was placed at the top of the document. While this code creates a connection to the database, in order to retrieve data it is necessary to use cursors.
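A sketch of that connection, with the driver and server names as placeholders (pyodbc is imported lazily so the string builder works even without the driver installed):

```python
def connection_string(driver="SQL Server", server="localhost", database="Test",
                      trusted=True):
    # Assemble an ODBC connection string from its parts (values are placeholders).
    parts = ["DRIVER={%s}" % driver, "SERVER=%s" % server, "DATABASE=%s" % database]
    if trusted:
        parts.append("Trusted_Connection=yes")   # Windows authentication
    return ";".join(parts)

def connect(conn_str):
    # Imported here so the helper above stays usable without the driver.
    import pyodbc
    return pyodbc.connect(conn_str)

if __name__ == "__main__":
    cnxn = connect(connection_string())
```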

Cursors
Cursors are structures that allow for traversal through the record set of data in a database. In order to use cursors in PYODBC, a cursor variable must be made. Then the SQL code is placed inside the cursor.execute function. In order to display the raw data, use this: While it is easiest to display raw data, it is also possible to display anything that the server has access to, such as the names of the tables, databases, etc. For example, the following function displays every table name and checks whether there is a table called 'Person Data'. The result will look like this:
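The cursor usage described above might look like the following sketch; the functions accept any PYODBC-style cursor, and the metadata query against INFORMATION_SCHEMA is a standard way to list table names on SQL Server:

```python
def fetch_all_rows(cursor, table):
    # Run a query through the cursor and return the record set it traverses.
    cursor.execute("SELECT * FROM %s" % table)
    return cursor.fetchall()

def table_names(cursor):
    # The server exposes its own metadata, so a cursor can also list tables.
    cursor.execute(
        "SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES "
        "WHERE TABLE_TYPE = 'BASE TABLE'"
    )
    return [row[0] for row in cursor.fetchall()]

def has_table(cursor, name="Person Data"):
    # Check whether a table with the given name exists.
    return name in table_names(cursor)
```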

API to HTML
This project is a culmination of all of the work that I have done at Agiline so far. The GitHub page here has all of the code described below, if you want to run it yourself. Thank you.

Adaptable HTML Table from Information Stored in JSON from an API
The goal of the assignment was to take data from an API, convert it into a pre-made SQL table on an MSSQL server, and then plug the information from the SQL server into an HTML table with sorting, conditional formatting, pagination, and search functionality. The main focus of this assignment was to make it as easy as possible for someone to edit data in either the API server or the SQL server without needing to edit the table-creation code itself. This doesn't mean that the entire process is adaptable to any API, but rather that if the JSON data in the API were updated, the scripts would only need to be run once again for the tables to be updated. This process was completed successfully, and it took an understanding of five different programming languages: JavaScript, HTML, SQL, Python, and CSS. While HTML, JavaScript, and CSS all have heavy documentation and optimization with regard to their use in conjunction, Python and SQL took much more time to integrate into this process. The methodology is explained below.

Converting an API into a SQL Table
There are three major problems associated with this step: the first is getting the data from the API, the second is formatting the data into a usable form, and the last is making the table from that data. While it would be much easier to convert the JSON data directly into HTML, the assignment required this path to be taken.

API Get Request
In order to first acquire the data from the API, the requests module for Python was imported. The data was then converted into a Unicode string variable. The API used was a sample OpenWeatherMap API, giving data on a week of weather reports in Moscow in JSON. Here is the text received from the API. Here is the link to the API page which was called. The formatting of the JSON, including the names of the given data keys, was particularly important in parsing the data in the next step.
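The request might be sketched as below; the sample endpoint and the query parameters (including the API key placeholder) are illustrative rather than taken from the original script:

```python
import requests
from urllib.parse import urlencode

def build_url(base, **params):
    # Pure helper: attach query parameters to the endpoint.
    return base + "?" + urlencode(params)

def fetch_json_text(url):
    # The response body arrives as a Unicode string of JSON.
    return requests.get(url).text

if __name__ == "__main__":
    # Endpoint and parameters are illustrative; substitute your own API key.
    url = build_url(
        "https://samples.openweathermap.org/data/2.5/forecast/daily",
        q="Moscow", cnt=7, appid="YOUR_API_KEY",
    )
    print(fetch_json_text(url)[:300])
```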

Parsing the JSON Data
Parsing the JSON data was rather difficult. Although SQL Server has integrated support for JSON, that support was introduced in the 2016 version, and I was using the 2014 version. In order to circumvent this issue, I decided to parse the data in Python, treating the JSON as a Python dictionary and indexing into it for the pieces of data I needed to make the table.

JSON Python Module
The first step was using the json module for Python, which is included in the standard library. Using this module, it was possible to convert the data string from the requests module into a Python dictionary.
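A small sketch of that conversion; the string below is a tiny stand-in for the much larger text returned by the requests module:

```python
import json

# A tiny stand-in for the string returned by the requests module:
raw = '{"city": {"name": "Moscow"}, "cnt": 7, "list": [{"temp": {"day": 280.5}}]}'

data = json.loads(raw)                   # string -> Python dictionary
print(data["city"]["name"])              # Moscow
print(data["list"][0]["temp"]["day"])    # 280.5
```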

PYODBC and SQL Implementation
The function that we will create in Python doesn't create a SQL table, but rather inserts data into an existing table. It is at this point that we need to have a table created before moving on. I chose a few pieces of important data from the OpenWeatherMap sample API and created a SQL table. No data was actually inserted into the table; rather, the important data points were made into columns and given proper formatting. Note that the "WeatherID" column has no value in the original JSON, but rather counts up for each new row, giving each a unique ID. Having this structure, we were able to properly format our SQL query from inside Python. However, it is important to explain how exactly I planned to run a SQL query in a Python script. I have explained this before in previous documentation, but to put it simply, it is possible using the PYODBC module, which allows a SQL query to be run directly inside of Python using cursors. The goal now is to construct a SQL query, using Python, that will automatically read the JSON and format the query based on the data it finds.
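The pre-made table might look like the sketch below; the column names and types are my guesses based on the description (only "WeatherID" is confirmed by the text), and the DDL is executed through a PYODBC cursor exactly as described:

```python
# Column names and types are illustrative guesses at the pre-made table.
CREATE_TABLE = """
CREATE TABLE WeatherData (
    WeatherID   INT PRIMARY KEY,   -- counts up per row; not present in the JSON
    ReportDate  DATETIME,
    DayTemp     FLOAT,
    Pressure    FLOAT,
    Humidity    INT
)
"""

def create_table(cursor):
    # PYODBC lets the SQL run directly inside Python through a cursor.
    cursor.execute(CREATE_TABLE)
```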

The Parsing Function
The format for updating a SQL table is as follows: It is important to note that the order in which the data is added must match the order of the respective columns in line 1 of the query. To build this query, a string is created from a while loop in Python, which runs through the data and adds the proper pieces of information from each identified key to the string. This loop runs until there is no more data to add from the JSON. When the 'query' variable is printed, this is what appears in the command prompt: In order to finally run the statement, commands from the PYODBC module are run: When this script is run, it updates the table in SQL with data from the API that matches the indexed keys in the parsing function. This is the complete Python script for the first file:
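A minimal sketch of that query-building loop; the table and column names are hypothetical, and production code should prefer PYODBC's parameterized "?" placeholders over string building:

```python
def sql_literal(value):
    # Quote strings for SQL; leave numbers as-is (illustrative, not injection-safe).
    if isinstance(value, str):
        return "'" + value.replace("'", "''") + "'"
    return str(value)

def build_insert_query(table, columns, records):
    # Line 1 of the query names the columns; every VALUES row below must
    # list its data in exactly that column order.
    query = "INSERT INTO %s (%s) VALUES" % (table, ", ".join(columns))
    rows = []
    i = 0
    while i < len(records):     # loop until there is no more data from the JSON
        rec = records[i]
        rows.append("(" + ", ".join(sql_literal(rec[c]) for c in columns) + ")")
        i += 1
    return query + "\n" + ",\n".join(rows) + ";"
```

Passing the finished string to cursor.execute and then committing on the connection pushes the rows to the server.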

Moving the SQL Data to an HTML Table
This second step also requires quite a few sub-steps, as it involves taking the SQL data, converting it back into JSON, and then creating an HTML table using the JSON data as a DOM. I will not explain everything in the HTML file's formatting, as that is not the most important aspect of this project. I will strictly be talking about two major steps: creating the JSON from SQL, and creating the HTML data layer and format layer.

Converting a SQL Table into JSON
Most of the work done in this section was done by Anthony DeBarros. I used much of his function in my own, only adding a few updates to his formatting. In general, he uses the built-in collections module, which provides data structures for storing collections of data. Along with this, I used the PYODBC and json modules as well.

Heading the Python Script
Here we import all the required modules for this new Python script, connect to the database, and establish some important variables. Note that the cursor.execute call uses a SQL query to grab all of the data from the table where I previously inserted my API data.
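A sketch of that header; the connection string values and the table name are placeholders, and cursor.description is the standard PYODBC way to recover the column names alongside the rows:

```python
import json
from collections import OrderedDict

def open_test_cursor():
    # Same connection pattern as before; names are placeholders.
    import pyodbc
    cnxn = pyodbc.connect("DRIVER={SQL Server};SERVER=localhost;DATABASE=Test;"
                          "Trusted_Connection=yes")
    return cnxn.cursor()

def select_all(cursor, table="WeatherData"):
    # Grab every row from the table the API data was inserted into,
    # plus the column names that cursor.description reports.
    cursor.execute("SELECT * FROM %s" % table)
    columns = [col[0] for col in cursor.description]
    return columns, cursor.fetchall()
```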

Building the JSON Dictionary
Next we use the 'OrderedDict' class from the collections module to help us create an ordered dictionary from the keys we choose. By inserting each Python dictionary into an object, we have essentially created a JSON from the data of the rows in our SQL table. The key references all do two things:


 * 1) Make a connection between the key objects in the dictionary and the rows of the SQL table.
 * 2) Rename the columns, allowing characters (such as spaces) that SQL column names do not permit.
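Both points can be sketched in one small function; the rename mapping shown in the test is hypothetical:

```python
import json
from collections import OrderedDict

def rows_to_json(columns, rows, rename=None):
    # rename maps raw SQL column names to friendlier display keys (point 2).
    rename = rename or {}
    objects = []
    for row in rows:
        record = OrderedDict()
        for col, value in zip(columns, row):   # point 1: tie keys to SQL columns
            record[rename.get(col, col)] = value
        objects.append(record)
    return json.dumps(objects, indent=4)
```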

Creating a .txt File Containing the JSON
Finally, we need to move our new JSON string into its own file. Luckily, Python is able to natively build files: When the script is run, a new SqlWeather.txt file is created in the folder where the Python script was run. It will look like this:
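The file-writing step reduces to a few lines with Python's built-in open function:

```python
def write_json_file(json_text, path="SqlWeather.txt"):
    # Python builds the file natively; by default it lands next to the script.
    with open(path, "w", encoding="utf-8") as out:
        out.write(json_text)
    return path
```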

Creating the HTML Page
In my HTML page, there is a lot of work put into the formatting and JavaScript functions that allow for a highly stylized table. In this section I will explain only the bare minimum of what I implemented, so that it is understood how to use the .txt file that was just created and create a table from it.

Getting the JSON.txt file using AJAX
AJAX is a JavaScript technique that allows data from different pages and servers to be accessed in a JavaScript file. We will use AJAX to access the data from the JSON that we just created, which sits in the same folder as our new HTML page. To start off, we need to create a function that will hold our AJAX request: This is the skeleton of every AJAX request. In the function's 'if' statement, we insert what we want to happen when our data is received. Now we have to tell the script what to do with our new information. Even though our function is only two lines, they are both supremely important. While the json2table function itself is imperative for the table to be generated, it was from another project before this one, so I won't go into detail. The reference for how to build it is here, and the CSS and JavaScript styling can be found here.

How can this be Improved?
What is important to note about this project is that the source must be edited only if one wishes to change where the source data comes from. While it is convenient that updating the table doesn't require editing the code, there are a few areas that, if addressed, could make the entire application work with any JSON or API.


 * 1) Find a way to create multiple tables based on the actual structure of the JSON. A JSON like the one created from the SQL data would only take one table to interpret, but the beauty of JSON is that the data can be deeply nested, requiring multiple tables to interpret.
 * 2) Interpret columns based on the key names. If the key names could be read and implemented, it wouldn't be too hard to create a loop that could both create a SQL table and update it with data from Python. It would likely rely on the same function as the one used to interpret the nested JSON.
 * 3) Interpret data types for SQL. Because SQL is able to classify data in multiple different ways, the Python function would need to be updated so that when it reads data, it knows what category of data SQL would interpret it as, and builds the query accordingly.