Sean Summers | Senior Solutions Architect | South Bend, Indiana | 1-337-935-0003
Senior Solutions Architect with experience designing and implementing cutting-edge AWS solutions using Infrastructure as Code (Terraform, CloudFormation). Designed (using Domain-Driven Design), delivered, and supported a Python Data Quality Engine for financial clients' reporting and compliance needs across SQL stores, data mesh, and data lake environments. Excels in Python, SQL, and scripting tools, particularly in optimizing data and CI/CD pipelines (GitHub Actions, GitLab CI, Jenkins). Mentored engineering teams and played a pivotal role in migrating computing resources to AWS.
- LinkedIn: https://linkedin.com/in/sean-summers
- Python GitHub repo, training example of API authentication: https://github.com/seansummers/py-cxml
- Python GitHub repo, example SQL ZipApp: https://github.com/seansummers/cd1-report
- CloudFormation GitHub repo, various CloudFormation templates: https://github.com/seansummers/dominie
★★★★★ DevOps ★★★★★ Unix Operating Systems ★★★★★ Database Design Principles ★★★★★ Data Modeling Techniques ★★★★★ Data Manipulation and Preparation Tools ★★★★★ Python ★★★★★ Data Pipelines ★★★★★ CI/CD Pipelines ★★★★★ Data Mesh Architecture ★★★★★ Domain-Driven Design Methodology ★★★★★ Python 1.2.1 to 3.12 ★★★★★ SQL, ETL and Data Warehousing (including Snowflake) ★★★★★ AWS DevOps, Security, Advanced Networking ★★★★★ Containerization, Functions as a Service ★★★★★ Security, Identity Management, Federation
AWS Senior Cloud Architect at Global Technology Solutions
Supported multiple teams in deploying development and production projects into AWS
Standardized all deployments to be Infrastructure as Code, defined in Terraform
Projects:
- custom Python vector index APIs (and batch embedding ingest)
- Python-based Large Language Model (LLM) applications (using LangChain and Hugging Face)
- Amazon Chime voice, video, and chatbots (using Python Lambdas)
- Amazon Connect (with Python Lambda integrations)
- Genesys Cloud CX integration with Amazon Lex/Polly chatbots (using Python Lambda)
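For illustration, the Lex integrations above typically hinge on a small Python fulfillment Lambda. A minimal sketch of a Lex V2 handler (the intent and slot names are hypothetical, not the project's actual bot design):

```python
"""Minimal Amazon Lex V2 fulfillment handler (intent and slot names hypothetical)."""


def lambda_handler(event, context):
    # Lex V2 delivers the current intent and its slots under sessionState.
    intent = event["sessionState"]["intent"]
    slots = intent.get("slots") or {}

    def slot_value(name):
        slot = slots.get(name)
        return slot["value"]["interpretedValue"] if slot else None

    city = slot_value("City")  # hypothetical slot
    if city:
        message = f"Connecting you to an agent for {city}."
    else:
        message = "Connecting you to an agent."

    # Close the dialog and mark the intent fulfilled.
    intent["state"] = "Fulfilled"
    return {
        "sessionState": {"dialogAction": {"type": "Close"}, "intent": intent},
        "messages": [{"contentType": "PlainText", "content": message}],
    }
```

The same handler shape serves both Amazon Connect and Genesys front ends, since both route contact flows through Lex intents.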
Data Quality Engine Architect and Delivery Team Lead at [Fiserv] Fiserv Technology Services (Apexon contract)
Projects:
- Air-gapped Python Data Quality Engine implementation, using approved Pydantic and SQLAlchemy libraries, and factored into a SOLID / hexagonal architecture to support future business requirements
- Used enterprise best practices and security guidelines to build a CI/CD pipeline with the tools available to deliver tagged code to deployment repos (GitLab, GitLab CI, Nexus)
- Integration with the existing homegrown ETL process, utilizing Python and MongoDB, allowing import of the Data Quality Engine as a Python library installable with pip
- Provided onboarding and process training to the on-shore Python developers delivering the PoC project, and documentation and hands-on training for the off-shore team tasked with Day 2+ operations and maintenance
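The hexagonal factoring mentioned above can be sketched as ports and adapters: the domain rule depends only on an abstract row source, so a SQLAlchemy, file, or in-memory adapter can be swapped in without touching the rule logic. All names below are hypothetical, not the engine's actual API:

```python
"""Hexagonal-architecture sketch of a data-quality check (all names hypothetical)."""
from typing import Iterable, Protocol


class RowSource(Protocol):
    """Port: any adapter that yields mapping-like rows."""

    def rows(self) -> Iterable[dict]: ...


def null_rate(source: RowSource, column: str) -> float:
    """Domain rule: fraction of rows where `column` is missing or None."""
    total = missing = 0
    for row in source.rows():
        total += 1
        if row.get(column) is None:
            missing += 1
    return missing / total if total else 0.0


class InMemorySource:
    """Adapter used in tests; a SQLAlchemy-backed adapter would mirror it."""

    def __init__(self, data):
        self._data = data

    def rows(self):
        return iter(self._data)
```

Keeping Pydantic/SQLAlchemy at the adapter edge is what makes such an engine portable across the SQL stores named elsewhere in this resume.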
Projects:
- Implement a new bank revenue application, using codat.io to load daily financial transactions from corporate borrowers and ML to determine daily remittance against obligations, on a tight timeline
- Designed a PoC using AWS Step Functions (Python Lambdas) to maintain isolated authentication and state for each borrower account link, and Webhooks as a Service using Amazon EventBridge event bus endpoints with Python consumers (SQS, SNS, and Step Function subscribers)
- Provided configurations to build and deploy using the enterprise's GitHub repositories, Harness CI/CD, and SonarQube quality gating and review process.
- Due to technical barriers and business events, this project was not promoted to production
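One way the webhook fan-out above can be consumed is an SQS-subscribed Python Lambda unwrapping EventBridge envelopes. A minimal sketch (the event type and field names are hypothetical, not codat.io's actual schema):

```python
"""Sketch of an SQS consumer for webhook events fanned out via an EventBridge
event bus (event type and field names are hypothetical)."""
import json


def lambda_handler(event, context):
    """Process an SQS batch; each record body is an EventBridge-wrapped webhook event."""
    remittances = []
    for record in event.get("Records", []):
        envelope = json.loads(record["body"])  # EventBridge envelope
        detail = envelope.get("detail", {})    # original webhook payload
        if detail.get("type") == "transactions.synced":  # hypothetical event type
            for txn in detail.get("transactions", []):
                remittances.append(
                    {"borrower": detail["borrowerId"], "amount": txn["amount"]}
                )
    return {"processed": len(remittances), "remittances": remittances}
```

Per-borrower isolation comes from keying each Step Function execution and queue message on the borrower account link, so one borrower's failures never block another's remittance calculation.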
Projects:
- Migrate clearinghouse auction software from C#/Win32 to Python
- Assisted the deployment team in specification of requirements and provided Infrastructure as Code pull requests to the appropriate GitHub repositories to have assets provisioned in the Rancher managed Kubernetes on-prem cluster via Harness CI/CD
- Assisted the data management team in proper SQL definitions and Infrastructure as Code provisioning of Postgres instances in the Rancher managed Kubernetes on-prem cluster
- Provided Infrastructure as Code pull requests to manage and execute the team's data schema as SQL DDL executed by the enterprise CI/CD system and Flyway deployment
Data Quality Engine Architect and Delivery Team Lead at [USAA] United Services Automobile Association (Apexon contract), San Antonio
Projects:
- Designed, coded, and delivered a Python Data Quality Engine using enterprise-approved OpenShift containers, Python (3.6 only), and JFrog Xray-managed libraries
- Built the first Talon Batch deployment into production in the enterprise, using Control-M to execute Python jobs on the OpenShift cluster with access to both legacy SQL (Netezza) and future-state SQL (Snowflake)
- Trained a 4-member team for Day 2+ management and maintenance of the engine and handed over the codebase
- Consulted on the Domino Data Lab Python notebook cluster migration to AWS, with a desired future state of replacement with AWS SageMaker
Project:
- Due to FFIEC requirements, all API traffic between on-prem and AWS is required to be encrypted and authenticated with auditable proof
- Assisted with coordination of the on-prem Kubernetes configuration of the Google Apigee gateway and the AWS-hosted Apigee gateways, with the goal of proxying all API traffic between sites
- Provided an nginx implementation as a PoC of the architecture, and assisted in the Ansible configuration of custom Linux images and an air-gapped install and configuration of Google Apigee
- Provided on-call support to the engineering teams to release project blockers in critical sections of the deployment schedule
Senior Site Reliability Engineer at WAITR, Lafayette, Louisiana
Projects:
- Managed the AWS Account constellation, migrating from manual deployments and instance-based MySQL and PostgreSQL stores to AWS Elastic Beanstalk and RDS Aurora MySQL and PostgreSQL
- Created CI/CD deployment pipelines using AWS CodeBuild, AWS CodePipeline, and CircleCI for GitHub repos providing Infrastructure as Code, data store, and API Swagger (OpenAPI) deployments into AWS API Gateway
- Refactored the monolithic application into AWS ECR-hosted containers deployed via AWS Elastic Beanstalk onto AMD instances with algorithmic deployment based on traffic and load, providing no-event "stampede day" elasticity through automatic provisioning and scaling
- Centralized all logging into AWS CloudWatch Logs and subsequent collection into AWS Kinesis Firehose endpoints for Athena querying of Parquet partitioned data at rest
- Provided a Python Lambda Step Function PoC to integrate enterprise HR systems with external web-hooks to automate talent acquisition efforts, decreasing the latency between job posting and position fulfillment
- Created a Python based AWS S3 subscriber system to integrate with Paylocity CSV for payroll information, significantly reducing the cost from the legacy EC2 instance running cron jobs
- Provided a CloudFormation managed pilot-light deployment and AWS CloudFront proxy for all production traffic, allowing instant in-flight redirection of traffic in the event of disaster or regional failure based on Route53 health checks. This also provided significant savings over legacy non-CDN traffic
- Negotiated an AWS Enterprise Agreement for custom pricing (requires a minimum of $100,000 MRR to AWS) for the company AWS accounts
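The CSV handling at the core of the Paylocity S3 subscriber above can be sketched as a pure parsing function; in AWS, an S3-triggered Lambda would fetch the object via boto3 and hand its text to this function (the column names here are hypothetical, not Paylocity's actual export schema):

```python
"""Sketch of the parsing core behind an S3-subscribed payroll CSV integration
(column names are hypothetical)."""
import csv
import io


def summarize_payroll(csv_text: str) -> dict:
    """Return total hours per employee from a payroll CSV export.

    Keeping the parse pure (string in, dict out) lets the S3/boto3 fetch
    live at the edge and keeps this logic unit-testable without AWS.
    """
    totals: dict = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        emp = row["employee_id"]                       # hypothetical column
        totals[emp] = totals.get(emp, 0.0) + float(row["hours"])
    return totals
```

Driving this from an S3 event notification, rather than a cron job on a standing EC2 instance, is what produced the cost reduction noted above: the Lambda only runs when a new file lands.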
Senior Cloud Engineer at Trek10, South Bend, Indiana, US
As an AWS Premier Tier Partner, Trek10 is in the top 10% of all AWS partners globally. As a Senior Cloud Engineer on the Professional Services team, I was involved with:
- Engagements with AWS Professional Services, where AWS provided on-site technical resources that coordinated under a Trek10 engagement contract to deliver our world-class serverless solutions (almost exclusively Python)
- Provided engineering for several AWS QuickStart templates published by AWS, delivering best-practice and PoC deployments of several services, including AWS CloudFormation StackSet integration with Jenkins CI/CD, an AWS CodePipeline solution for multi-party approval flows, and several serverless Lambda event-triggered examples
- Worked on-site with customer engineering teams and subject matter experts to overcome technical resistance on the Cloud Adoption Journey and provided expertise and oversight on solutions while ensuring the customer's access to relevant technical and educational material (some covered under NDA with AWS)
Senior Systems Administrator, Enterprise Systems Unit at Hesburgh Libraries of the University of Notre Dame, Notre Dame, Indiana, USA
- Lead Architect for the ND CloudFirst Initiative, a 3-year project to migrate 80% of computing resources to AWS
- Mentored the existing Enterprise Systems Unit and library engineering teams, with the outcome of almost 50% AWS certification across the departments
- Infrastructure as Code using CloudFormation and Ansible exclusively for all deployments, as humans are not allowed to taint production accounts
- Wrote the guidelines and runbooks, with automated CI/CD pipelines implementing deployment in AWS for all library assets published to AWS
- Hesburgh Libraries' lead representative to the OIT and CloudFirst organizations, participating in several cross-campus initiatives and rollouts of AWS technology, from initial use case, through approval, and building CloudFormation guardrails for automated deployment, with the desired future state of integrating with ServiceNow
- Used AWS PrivateLink to integrate an AWS SQS queue with the campus Talend deployment, allowing newly issued ID badges to transact business in the library simultaneously with printing (significantly improving on the target SLO of 10 minutes)
- Used Apache NiFi to integrate with Indiana's legacy ILLiad borrowing system (ILLiad has no API, only a web-based UI), allowing instant automated synchronization between the legacy library catalog system and external Indiana institutions' assets, saving almost a week per month in labor costs ongoing
Schreiner University, Kerrville, Texas, USA 150+ credit hours towards Mathematics and Philosophy
Member of the University's inaugural Honors program cohort
- November 2018 – November 2024 AWS Certified Developer - Associate, AWS ID WjFF9DY12JB41N9X
- July 2017 – February 2021 [DOP] AWS Certified DevOps Engineer - Professional, AWS ID 763XEBHK12VEQR5L
- May 2018 – May 2020 [ANS] AWS Certified Advanced Networking - Specialty, AWS ID S86FKHE2LF41Q3C4
- July 2015 – July 2018 Certified Ethical Hacker v8, EC-Council ID ECC26959285350
- December 2000 – May 2013 [MCSE+I] Microsoft Certified Systems Engineer + Internet, Microsoft ID A010-7856, MCP 276427