Technical Data & Automation Analyst – Infrastructure & Development (Internship / Analyst Program)

Location: Remote  

Work Format: Deliverable-based, flexible scheduling  

Position Type: Internship / Analyst Program  

Classification: Part-Time / Temporary / Seasonal  

Program Duration: 8–12 weeks (aligned with academic term schedules and student availability)

---

Role Overview

Morsby, Gorman, McCarthy LLC is seeking Technical Data & Automation Analyst Interns to support infrastructure and development initiatives through scalable data processing, automation, and systems development.

This role supports the technical layer of a multi-disciplinary data workflow by enabling the efficient extraction, processing, structuring, and organization of large volumes of information. Its primary focus is developing and implementing technical solutions that improve the speed, consistency, and scalability of data handling across projects.

Participants will work with high-volume datasets and source materials, including research reports, institutional documents, policy papers, and other structured and unstructured data sources. The role involves working with tools and scripting approaches to automate repetitive processes, assist in parsing large documents, and support the transformation of raw data into structured outputs.
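
As an illustration of the document-parsing work described above, here is a minimal sketch of extracting page text from a PDF. It assumes the open-source pypdf library; the file name `report.pdf` is a hypothetical placeholder, not a project document.

```python
# A minimal sketch, assuming the open-source pypdf library (pip install pypdf).
# "report.pdf" is a hypothetical placeholder file, not a project document.
from pypdf import PdfReader

def extract_pages(path: str) -> list[str]:
    """Return the extracted text of each page in a PDF as a list of strings."""
    reader = PdfReader(path)
    # extract_text() can return None for image-only pages, so substitute "".
    return [page.extract_text() or "" for page in reader.pages]

if __name__ == "__main__":
    pages = extract_pages("report.pdf")
    print(f"Extracted {len(pages)} pages; first 200 characters of page 1:")
    print(pages[0][:200])
```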

This position is part of a broader workflow that connects research, data structuring, systems development, and strategic analysis. The Technical Data & Automation Analyst serves as the technical bridge that enables efficient data flow across these functions.

The role is designed to provide hands-on exposure to applied data processing, scripting, and workflow automation within a real-world project environment.

The technical scope of this role is intentionally educational, ranging from introductory to intermediate. Participants will not be expected to build production-level systems or enterprise-grade infrastructure, but rather to contribute to guided, scoped technical tasks that support learning and development.

---

Core Purpose of the Role

The core purpose of this role is to support the technical processing and automation of data workflows by:

Assisting in the extraction of data from large and complex documents  

Supporting the development of repeatable and scalable data processing workflows  

Automating aspects of data cleaning, structuring, and organization (see the sketch after this list)

Improving efficiency in handling large volumes of information  

Contributing to the creation of structured datasets from unstructured inputs  

Supporting the integration of data into internal systems and workflows  
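
As a concrete illustration of the cleaning and structuring work listed above, here is a minimal sketch using the pandas library. The column names and sample values are illustrative assumptions, not actual project data.

```python
# A minimal sketch of automated cleaning and structuring, assuming pandas
# (pip install pandas). Column names and values are illustrative only.
import pandas as pd

def clean_records(raw: pd.DataFrame) -> pd.DataFrame:
    """Standardize column names, trim whitespace, and drop duplicate rows."""
    df = raw.copy()
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    for col in df.select_dtypes(include="object").columns:
        df[col] = df[col].str.strip()
    return df.drop_duplicates().reset_index(drop=True)

if __name__ == "__main__":
    raw = pd.DataFrame({" Project Name ": ["Bridge A ", "Bridge A ", "Road B"],
                        "Status": ["active ", "active ", " planned"]})
    print(clean_records(raw))
```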

This role is educational in nature and is intended to help participants develop practical experience in applied data processing, scripting, and automation.

---

Key Responsibilities

Technical Data & Automation Analyst Interns may contribute to activities including, but not limited to:

Assisting in extracting data from large PDF files and document sets  

Supporting the development of scripts to process and organize data  

Working with structured and unstructured data sources  

Automating repetitive data handling and transformation tasks  

Cleaning, formatting, and standardizing data using technical tools  

Supporting the creation of structured datasets from raw inputs  

Assisting in developing workflows for handling high-volume data  

Collaborating with Data & Systems Analysts to integrate automated outputs into structured datasets 

Identifying inefficiencies in current workflows and proposing technical improvements  

Supporting basic data parsing, text extraction, and pattern identification processes (a sketch follows this list)

Documenting scripts, workflows, and data processing methods where applicable  

Depending on experience level, participants may take on more advanced automation or scripting tasks over time  
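
The sketch below illustrates the kind of basic parsing and pattern identification mentioned above, using Python's built-in re module. The patterns and the sample text are illustrative assumptions, not project specifications.

```python
# A small sketch of text parsing and pattern identification using Python's
# built-in re module. Patterns and sample text are illustrative only.
import re

DATE_RE = re.compile(r"\b\d{4}-\d{2}-\d{2}\b")      # ISO dates, e.g. 2024-03-15
AMOUNT_RE = re.compile(r"\$\d[\d,]*(?:\.\d{2})?")   # dollar amounts, e.g. $1,250.00

def find_patterns(text: str) -> dict[str, list[str]]:
    """Return all ISO dates and dollar amounts found in a block of text."""
    return {"dates": DATE_RE.findall(text),
            "amounts": AMOUNT_RE.findall(text)}

if __name__ == "__main__":
    sample = "Contract signed 2024-03-15 for $1,250,000.00; review due 2024-09-01."
    print(find_patterns(sample))
```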

---

Example Project Applications

Participants may contribute to tasks such as:

Extracting structured data from large document sets  

Automating formatting or cleaning of extracted data  

Supporting workflows that process multiple reports into unified datasets (see the sketch below)

Assisting in organizing outputs for use by research and data teams  
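
The following sketch shows one way such a workflow might combine per-report extracts into a unified dataset, assuming each report has already been parsed into its own CSV file. The folder layout and the "source" column are hypothetical conventions for illustration.

```python
# A hedged sketch of unifying per-report extracts, assuming each report has
# already been parsed into a CSV file. The folder layout and "source" column
# are hypothetical conventions, not an actual project specification.
from pathlib import Path
import pandas as pd

def unify_reports(folder: str) -> pd.DataFrame:
    """Concatenate every CSV extract in a folder, tagging each row's source."""
    frames = []
    for path in sorted(Path(folder).glob("*.csv")):
        df = pd.read_csv(path)
        df["source"] = path.name  # keep provenance so rows can be traced back
        frames.append(df)
    if not frames:
        raise FileNotFoundError(f"no CSV extracts found in {folder!r}")
    return pd.concat(frames, ignore_index=True)
```

Tagging each row with its source file preserves provenance, so reviewers can trace any entry in the unified dataset back to the report it came from.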

---

Expected Deliverables

Participants may be expected to produce outputs such as:

Basic scripts for data extraction or processing  

Automated workflows for handling repetitive data tasks  

Cleaned and structured datasets generated through automated processes  

Formatted outputs derived from raw document inputs  

Documentation of scripts, logic, and workflow processes  

Improved or optimized versions of existing data handling processes  

Deliverables will be scoped based on project needs and participant experience level. All outputs will be reviewed to ensure quality, accuracy, and alignment with project objectives.

---

Nature of the Work

This is a technically oriented internship role focused on applied data processing and workflow automation.

Participants will gain exposure to:

Working with large-scale document and dataset inputs  

Understanding how automation can improve data workflows  

Applying scripting or technical tools to real-world data challenges  

Supporting scalable solutions for processing structured and unstructured information  

Contributing to technical systems that support broader project functions  

The role combines independent technical work with collaboration across research and data teams.

---

Qualifications

Candidates for this role should ideally demonstrate:

Basic familiarity with programming or scripting concepts  

Comfort working with structured data and technical tools  

Strong problem-solving and analytical thinking ability  

Attention to detail and accuracy in handling data  

Ability to follow structured workflows and technical guidance  

Willingness to learn and apply new tools or methods  

Ability to work independently on assigned technical tasks  

Advanced programming experience is not required; foundational familiarity with programming concepts, for example through coursework, is sufficient.

---

Preferred Academic Backgrounds / Majors

This role may be particularly well suited to students from backgrounds such as:

Computer Science  

Data Science  

Information Systems  

Software Engineering  

Mathematics / Statistics 

Engineering disciplines 

Business Analytics (technical focus)  

Related technical or quantitatively oriented fields  

Candidates from adjacent disciplines with relevant technical skills are encouraged to apply.

---

Nice to Have

The following are not required, but may strengthen a candidate’s fit:

Familiarity with Python or similar scripting languages  

Exposure to data processing or automation workflows  

Experience working with structured datasets  

Basic understanding of handling large volumes of data  

Exposure to text processing, document parsing, or data extraction tools and workflows  

Familiarity with organizing or cleaning messy data  

Interest in infrastructure, development, or large-scale projects  

---

Tools & Exposure

Participants may gain exposure to:

Python or similar scripting tools in a guided environment  

Data processing workflows 

Basic automation concepts 

Structured data handling techniques  

Document parsing and extraction approaches  

Integration of automated outputs into structured systems  

The role emphasizes practical application and learning rather than advanced software development.

---

Preferred Candidate Level

This role is best suited for:

Junior and Senior undergraduate students  

Graduate students  

Strong Sophomore candidates with relevant technical exposure  

Candidates should have at least introductory familiarity with technical or data-related coursework.

---

Who This Role Is Best For

This role is a strong fit for:

Students interested in technical data work or automation  

Individuals who enjoy solving problems through systems and tools  

Candidates interested in scripting, data processing, or workflow optimization  

Students seeking applied experience beyond coursework  

Individuals interested in scalable solutions for handling large datasets  

Candidates who are detail-oriented, logical, and technically curious  

---

Program Structure & Time Commitment

Program Duration  

8–12 weeks aligned with academic schedules  

Time Commitment  

Approximately 5–15 hours per week (target ~10 hours/week)  

Flexible scheduling to accommodate academic commitments  

Extension Possibility 

Extensions may be considered based on performance and project needs  

---

Compensation

This is an unpaid, educational internship designed for academic and professional development. The role focuses on skill-building, applied experience, and mentorship rather than paid employment.

This internship is structured as an educational experience and is intended to comply with applicable guidelines governing unpaid internships. The primary beneficiary of the program is the participant, with a focus on skill development, training, and academic alignment.

---

Educational & Academic Credit Framework

This role is designed as a structured educational experience. Participants will receive defined technical tasks, mentorship, and guidance.

The position is intended to complement academic study and does not replace paid employee roles.

This internship may be eligible for academic credit depending on institutional requirements; participants should coordinate credit arrangements with their university.

The organization is willing to provide documentation and supervision confirmation where appropriate.

---

Learning Outcomes

Participants can expect to:

Develop practical experience with data processing and automation  

Gain exposure to scripting and technical data workflows  

Learn how to handle large-scale data inputs  

Understand how automation supports real-world projects  

Improve problem-solving and technical thinking skills  

Gain experience working within a structured technical environment  

---

Supervision & Program Structure

Participants will operate within a defined supervisory structure with assigned oversight.

They may receive:

Defined technical assignments  

Guidance on tools, scripting approaches, and workflow structure  

Support when working with unfamiliar technical concepts  

Feedback on outputs and processes  

Ongoing coordination with project leads  

Work will be reviewed to ensure quality, accuracy, and alignment with project standards.

---

Feedback & Evaluation

Participants will receive feedback on:

Technical accuracy  

Problem-solving approach 

Efficiency and workflow improvements  

Quality of outputs  

Ability to follow and improve structured processes  

---

Professional Expectations

Participants are expected to:

Maintain accuracy and organization in technical work  

Communicate clearly and professionally  

Meet agreed-upon deadlines 

Follow structured workflows and incorporate feedback  

Demonstrate accountability and reliability  

---

What You’ll Gain

Exposure to real-world data processing challenges  

Experience with automation and scripting concepts  

Development of technical and analytical skills  

Opportunity to contribute to scalable systems  

Professional mentorship and feedback  

---

Growth Opportunities

High-performing participants may be considered for:

Advanced technical responsibilities  

Extended roles  

Future project involvement 

Potential transition into longer-term technical positions  

---

Application Instructions

Apply via Handshake and submit:

Resume  

Handshake profile  

A brief 3–5 sentence statement of interest explaining your motivation and relevant technical experience 

---

Summary Positioning Statement

This role is intended to function as a structured, educational, technically oriented internship that provides hands-on experience in data processing, automation, and scalable workflow development. It is designed to support academic learning while providing practical exposure to real-world technical data challenges within infrastructure and development initiatives.