Senior Full-Stack Developer

When I was seven years old, I was given a Packard Bell computer and taught myself MS-DOS 5.0, which sparked a lasting interest in programming. Since then, I have become proficient in JavaScript (vanilla, React, & React Native), Java (Spring Boot), MongoDB, Python, and most recently TensorFlow, along with various other technologies such as Docker, Kubernetes, Jenkins, and AWS. My passion is putting ideas into practice with code.

Software/Toolkits

Textalytic

Browser-based text analysis that handles pre-processing, analysis, and visualization in an easy-to-use web interface. Written in vanilla JavaScript, with charts and graphs built on D3.js and Chart.js and tables on Handsontable; the application is broken out into five Node.js microservices.

Stylometry Toolkit

A browser-based toolkit to identify stylistic signatures characteristic of Latin prose and verse using a combination of quantitative stylometry and supervised machine learning.

Anagram Toolkit

A browser-based sequence alignment toolkit for the detection of anagrams in Latin literature.

Filum Toolkit

A JavaScript tool for identifying verbal resemblances in literature, using FuzzySearch and Levenshtein distance.
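
A rough sketch of the kind of character-level comparison this approach relies on is below; the function names and sample phrases are illustrative, not the tool's actual API.

```javascript
// Classic dynamic-programming Levenshtein edit distance.
function levenshtein(a, b) {
  const rows = a.length + 1;
  const cols = b.length + 1;
  const d = Array.from({ length: rows }, () => new Array(cols).fill(0));
  for (let i = 0; i < rows; i++) d[i][0] = i;
  for (let j = 0; j < cols; j++) d[0][j] = j;
  for (let i = 1; i < rows; i++) {
    for (let j = 1; j < cols; j++) {
      const cost = a[i - 1] === b[j - 1] ? 0 : 1;
      d[i][j] = Math.min(
        d[i - 1][j] + 1,       // deletion
        d[i][j - 1] + 1,       // insertion
        d[i - 1][j - 1] + cost // substitution
      );
    }
  }
  return d[rows - 1][cols - 1];
}

// Normalize distance to a 0..1 similarity so phrases of different
// lengths can share one threshold.
function similarity(a, b) {
  const maxLen = Math.max(a.length, b.length) || 1;
  return 1 - levenshtein(a, b) / maxLen;
}

// Two Latin phrases with partial verbal overlap.
console.log(similarity('arma virumque cano', 'arma virosque cano')); // ~0.89
```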

Work Experience

Under Armour (projekt202) - Sr. Developer

- Convert a monolithic Python service to a modern Spring Boot microservices architecture.
- Implement CI/CD with Jenkins, Docker, & Kubernetes.
- Support the React.js UI framework for Brand Challenges.
- Utilize tools such as Celery, RabbitMQ, Python, MongoDB, Prometheus, Kubernetes, Confluence, Jira, and Bitbucket.
- Meet with product & design teams for sprint planning & retrospectives.
- Meet with technical leadership for high-level architecture choices.
- Lead a team of frontend, backend, Android, and iOS developers.
- Implement external APIs such as Segment, Snowflake, Amplitude, & Salesforce Marketing Cloud.

UT Austin - QCL Lab - UI/UX & JavaScript Developer

- Research literature using machine learning, natural language processing, bioinformatics, and systems biology.
- Implement automated CI/CD with GitLab & Docker.
- Rewrite the Filum tool in JavaScript; it uses sequence alignment, a technique derived from computational biology that considers the character-by-character similarity of phrases.
- Collaborate with the web team on UI/UX.
- Automate analysis of large literary text files.

Austin Software Works - Full Stack Developer & ML Engineer

- Proficient in PHP/MySQL | JavaScript/MongoDB | Meteor/Node.js
- Develop, debug, and test custom applications for business clients
- Use R for data analysis and hypothesis testing
- Deploy application code to AWS EC2/RDS instances
- Commit code changes to GitHub via Git
- Create visual charts using D3 & Tableau
- Create TensorFlow models in Python
- Optimize NVIDIA CUDA and OpenMP code for GPU computing

UT Austin - Project Manager

- Oversee technicians for the VoIP conversion of 20,000 phones
- Modify and configure Cisco switches to accommodate new VoIP phones
- Convert a spreadsheet-based process to an efficient PHP/MySQL data entry workflow
- Reduce survey time by 50% and deployment time by 30%
- Assist in planning best practices for site surveys
- Ensure surveys and deployments run on schedule
- Migrate site surveys from Excel to a web-based application

The I.T. Department - Head Geek

- Install & configure CentOS/Ubuntu AWS instances
- Provide full-service I.T. support to small businesses in Austin
- Move physical servers to hosted solutions in AWS
- Address trouble tickets opened by customers via email
- Utilize tools such as VMware, Git, HeidiSQL, and PuTTY
- Install & configure Cisco routers, switches, & access points
- Implement VoIP networks using Cisco appliances
- Configure and manage load balancers and SSL encryption devices
- Facilitate communication and develop relationships across many cross-functional teams

SolarWinds - Sales Engineer

- Determine the strategic needs of IT management for enterprises.
- Assist customers in designing network/storage system architecture.
- Demonstrate expert knowledge of vendor-specific technologies including VMware, Citrix, and Cisco.

G.S.A. - Computer Technician

- Deploy new Dell laptops to all members of the GSA
- Back up all user data to fiber-networked NAS drives
- Ensure encryption is enabled on all user machines
- Ensure computers can log onto the encrypted network
- Ensure all user data is restored

Research

Estimation of potential United States influenza mortality: an agent-based model simulation

Planning for the next influenza pandemic is of vital concern to many public health officials. The aim is to use an agent-based model of influenza in the United States to estimate mortality should a pandemic occur today. Models of infectious disease epidemics have proven useful in understanding the dynamics of disease transmission and vaccination strategies, and are routinely used to support decisions made by public health officials. Using an agent-based model of influenza A virus infection, we simulate the effects of two strains of influenza A causing a pandemic in the United States.
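
For a sense of what an agent-based epidemic simulation involves, here is a deliberately tiny sketch; the population size, contact rate, and per-strain parameters are invented for illustration and are not the study's calibrated values.

```javascript
// Toy two-strain agent-based SIR-style simulation. All parameters are
// illustrative assumptions, not the study's calibrated inputs.
const N = 10000;              // agents
const CONTACTS_PER_DAY = 10;  // random contacts per infected agent per day
const STRAINS = [
  { name: 'A1', pTransmit: 0.03, pDeath: 0.001, daysInfectious: 5 },
  { name: 'A2', pTransmit: 0.05, pDeath: 0.002, daysInfectious: 4 },
];

// Each agent is Susceptible ('S'), Infected ('I'), Recovered ('R'), or Dead ('D').
const agents = Array.from({ length: N }, () => ({ state: 'S', infection: null }));

// Seed each strain with 10 infected agents.
STRAINS.forEach((strain, s) => {
  for (let i = 0; i < 10; i++) {
    const a = agents[s * 10 + i];
    a.state = 'I';
    a.infection = { strain, daysLeft: strain.daysInfectious };
  }
});

let deaths = 0;
for (let day = 0; day < 120; day++) {
  for (const a of agents) {
    if (a.state !== 'I') continue;
    // Infected agents contact random others and may transmit their strain.
    // (New infections may act the same day; fine for a toy model.)
    for (let c = 0; c < CONTACTS_PER_DAY; c++) {
      const other = agents[Math.floor(Math.random() * N)];
      if (other.state === 'S' && Math.random() < a.infection.strain.pTransmit) {
        other.state = 'I';
        other.infection = { strain: a.infection.strain, daysLeft: a.infection.strain.daysInfectious };
      }
    }
    // Each day an infected agent may die; otherwise it progresses toward recovery.
    if (Math.random() < a.infection.strain.pDeath) {
      a.state = 'D';
      deaths++;
    } else if (--a.infection.daysLeft <= 0) {
      a.state = 'R';
    }
  }
}
console.log(`Simulated deaths out of ${N} agents: ${deaths}`);
```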

Filum: Large-scale identification of literary intertextuality using sequence alignment

Profiling of intertextual relationships is foundational for literary study. We demonstrate Filum, a user-friendly tool that employs character-level sequence alignment to detect verbal parallels with or without lexical overlap. When applied to a database of more than 1,000 intertextual parallels of known significance from the Latin poet Valerius Flaccus, Filum is able to recover more than 80% with reasonable specificity and to identify more than 250 new intertexts previously unrecorded in the scholarship.

Computational Differentiation of Latin Literary Genre

This article describes a new quantitative approach to the study of Latin literary genre. Using computational text analysis and supervised machine learning, we construct a detailed stylistic profile of the vast majority of extant classical Latin literature and classify works by traditional genre with high accuracy. By examining the statistical basis for these automated classification decisions, we identify salient stylistic characteristics of each genre at the level of syntax and non-content vocabulary. Through a series of case studies, we illustrate how this approach enables both confirmation at scale of long-appreciated stylistic tendencies and identification of unrecognized generic signatures.
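
As a toy illustration of supervised classification over stylistic features (not the paper's actual data or models), a nearest-centroid classifier might look like this:

```javascript
// Toy nearest-centroid genre classifier over stylometric feature vectors.
// The genres, feature values, and classifier choice are illustrative
// assumptions, not the paper's actual data or models.
const training = [
  { genre: 'epic',    features: [0.041, 0.012, 0.009] },
  { genre: 'epic',    features: [0.039, 0.014, 0.011] },
  { genre: 'oratory', features: [0.025, 0.031, 0.022] },
  { genre: 'oratory', features: [0.027, 0.029, 0.024] },
];

// Average the feature vectors of each genre's training texts.
function centroids(data) {
  const byGenre = {};
  for (const { genre, features } of data) {
    (byGenre[genre] ||= []).push(features);
  }
  return Object.entries(byGenre).map(([genre, rows]) => ({
    genre,
    center: rows[0].map((_, i) => rows.reduce((s, r) => s + r[i], 0) / rows.length),
  }));
}

// Assign an unseen text to the genre with the nearest centroid.
function classify(features, cents) {
  let best = null;
  let bestDist = Infinity;
  for (const { genre, center } of cents) {
    const dist = Math.hypot(...center.map((c, i) => c - features[i]));
    if (dist < bestDist) { bestDist = dist; best = genre; }
  }
  return best;
}

console.log(classify([0.040, 0.013, 0.010], centroids(training))); // 'epic'
```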

A Stylometry Toolkit for Latin Literature

Computational stylometry has become an increasingly important aspect of literary criticism, but many humanists lack the technical expertise or language-specific NLP resources required to exploit computational methods. We demonstrate a stylometry toolkit for analysis of Latin literary texts, which is freely available at www.qcrit.org/stylometry. Our toolkit generates data for a diverse range of literary features and has an intuitive point-and-click interface. The features included have proven effective for multiple literary studies and are calculated using custom heuristics without the need for syntactic parsing. As such, the toolkit models one approach to the user-friendly generation of stylometric data, which could be extended to other premodern and non-English languages underserved by standard NLP resources.
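
To give a concrete sense of a parse-free, heuristic feature (a simplified assumption, not the toolkit's actual implementation), consider counting relative frequencies of common Latin function words:

```javascript
// Illustrative sketch of a parse-free stylometric feature: relative
// frequencies of common Latin function words. The word list and the
// tokenizer are simplified assumptions, not the toolkit's heuristics.
const FUNCTION_WORDS = ['et', 'in', 'non', 'est', 'cum', 'sed', 'ut', 'ad', 'quod', 'si'];

function functionWordFrequencies(text) {
  // Lowercase, strip punctuation, split on whitespace.
  const tokens = text.toLowerCase().replace(/[^a-z\s]/g, ' ').split(/\s+/).filter(Boolean);
  const counts = Object.fromEntries(FUNCTION_WORDS.map((w) => [w, 0]));
  for (const t of tokens) {
    if (t in counts) counts[t]++;
  }
  // Normalize by token count so texts of different lengths are comparable.
  const features = {};
  for (const w of FUNCTION_WORDS) features[w] = counts[w] / tokens.length;
  return features;
}

console.log(functionWordFrequencies('Gallia est omnis divisa in partes tres.'));
```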

Presentations

How to Do Philology with Computers - Boston, MA

Computational stylometry has aided the work of philologists for over 50 years. From simple word counts to the latest use of machine learning for authorship attribution, computation offers the literary critic a wide array of techniques to better understand individual texts and large corpora. To date, these methods have largely been accessible to specialists possessing a background in programming and statistics. The Quantitative Criticism Lab has now designed a user-friendly toolkit that will allow humanists with no prior training in the digital humanities to obtain a wide range of philological data about most classical texts and to perform sophisticated quantitative analyses—all using a simple point-and-click interface. This presentation will demonstrate some of the experiments and literary critical insights enabled by the toolkit, and discuss relevant issues of interpretation and statistical analysis.

Articles

How I built a complex text analysis app in a month

Published via freeCodeCamp - 05/03/2018

Natural Language Processing with Spacy in Node.js

Published via Medium - 03/24/2019

Load balance Node.js with Nginx without modifying your app

Published via Medium - 03/24/2019

How I integrated the Instagram Private API in React Native

Published via Medium - 05/27/2020

Translating User-Generated Content in Real Time w/ AWS Translate

Published via Medium - 08/11/2021

Using Google Bard's API with Node

Published via Medium - 04/22/2023

Enhancing File Security in AWS S3

Published via Medium - 04/23/2023

Competitions

Machine Generation of Analytic Products

Finalist
The ODNI and OUSD(I) seek to uncover current capabilities and examine the feasibility of using natural language processing (NLP) and related artificial intelligence technologies to craft intelligence products with national security implications. This Challenge poses a representative question to be answered by respondents using an automated system of their own design.

Identify biothreats in real time

Pending
Successful concepts will explore connections between multiple readily accessible data sources to develop real-time insights that can improve public safety responses to emerging biothreats. Warnings will ideally point to signals that fall within a zero- to ten-day time horizon of first instances of exposure using timely data sets that become available less than 36 hours after inputs are received.

Machine Evaluation of Analytic Products

Pending
The Seekers are asking Solvers to describe a viable technical approach for enabling the automated evaluation of finished intelligence products. Solvers must provide a well-supported, technology-based justification for how the proposed solution could—at a minimum—rapidly evaluate and numerically score brief (1-2 page) analytic intelligence products against the specified criteria with no human intervention.