
DAN STORM

EDUCATION

6/99 University of Chicago


Master of Business Administration
Concentrations: Finance, Accounting, Strategic Management

8/94 University of Texas at Austin


Bachelor of Business Administration
Major: Management Information Systems Minor: Finance

EXPERIENCE

Stearns Lending – Santa Ana, CA


Senior SQL Server DBA
5/15 – Present Industry: Mortgage Lending
Mandate: Stabilize SQL Server infrastructure, implement High Availability and Disaster Recovery topology,
and assist with migration of mission-critical applications
Technical: SQL Server 2005 to 2016. There are 120 SQL Server database instances solely managed by me, with
all production Tier 1 databases in seven separate AlwaysOn availability groups in both active/passive
and active/active topologies.
 Conducted annual Disaster Recovery testing
 Implemented standardized library of stored procs and UDFs on all instances to automate nightly and weekly
database maintenance as well as nightly full and intra-hour transaction log backups
 Planned and implemented High Availability and Disaster Recovery using AlwaysOn (synchronous for HA and
asynchronous for DR) for SQL 2014 and log shipping for SQL 2008 R2
 Planned and implemented reporting topology using real-time transactional replication as well as log shipping
 Troubleshot and performance-tuned OLTP processing during business hours and BI refresh processing after
hours
 Collaborated with the Information Security group to lock down and secure the database user base using Active
Directory groups
 Worked with DEV, QA, and PMO teams to apply upgrades and fixes to upper environments
 Recruited to build out internal DBA team
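
The standardized maintenance library described above can be sketched in outline. This is a minimal, hypothetical example of an automated full-backup proc of that kind; the proc name, path, and options are illustrative assumptions, not taken from the actual library.

```sql
-- Hypothetical sketch of a nightly full-backup proc; names and paths
-- are illustrative, not from the library described in this resume.
CREATE PROCEDURE dbo.usp_BackupAllUserDatabases
    @BackupRoot NVARCHAR(260) = N'X:\SQLBackups\'
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @db SYSNAME, @sql NVARCHAR(MAX);
    DECLARE dbs CURSOR LOCAL FAST_FORWARD FOR
        SELECT name FROM sys.databases
        WHERE database_id > 4 AND state_desc = 'ONLINE';  -- skip system DBs
    OPEN dbs;
    FETCH NEXT FROM dbs INTO @db;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @sql = N'BACKUP DATABASE ' + QUOTENAME(@db)
                 + N' TO DISK = ''' + @BackupRoot + @db + N'_'
                 + CONVERT(NVARCHAR(8), GETDATE(), 112) + N'.bak''
                   WITH COMPRESSION, CHECKSUM, INIT;';
        EXEC sys.sp_executesql @sql;
        FETCH NEXT FROM dbs INTO @db;
    END
    CLOSE dbs; DEALLOCATE dbs;
END
```

A SQL Server Agent job would call such a proc nightly; the intra-hour transaction log backups would follow the same pattern with BACKUP LOG.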

Arbonne International – Irvine, CA


Senior SQL Server DBA
8/13 – 5/15 Industry: eCommerce, Multi-Level Marketing Cosmetics
Mandate: Stabilize SQL Server infrastructure, migrate databases to new third-party e-commerce solution, and
provide day-to-day database operation support with a team of three database developers and
administrators
Technical: SQL Server 2000 to 2012. There are 55 SQL Server database instances hosted either at co-location
facilities or in Amazon’s AWS cloud
 Achieved and sustained 99.999% availability of database services for e-commerce portals for all SDLC
environments – PROD, UAT, QA, and DEV
 Engaged with project managers, business analysts, and developers to re-host Arbonne’s revenue-generating
e-commerce website from an archaic ASP-based infrastructure to a third-party platform (Prowess by Thatcher
Technology Group) that supports multi-level marketing. After 18 months of preparation, planning,
development, and testing, the migration was completed in March 2015 with complete success.
 Worked with DEV, QA, and Ops teams to address performance issues in the ASP infrastructure that impacted
order entry during month-end load. The site previously crashed for hours each month-end because unoptimized
queries pushed the OLTP server’s CPU to 100% utilization; now there are no disruptions to OLTP services.
 Implemented verbose logging in batch process stored procs to identify slowest steps needing optimization
 Actively guided third-party monitoring services and off-shore resources to ensure 24/7/365 uptime
 Recruited to build out internal DBA team
 Collaborated with application administrators and server administrators to migrate databases to more powerful
instances as needed. Spun up and documented each new instance.
 Stood up and balanced active/active database clusters
 Conducted IOMeter testing to verify performance of AWS and SAN storage
 Developed backup and restore stored procs to automatically refresh DEV and QA environments on an as-needed
basis with PROD data and then redact any sensitive information (e.g., passwords, SSNs, credit cards)
 Designed, developed, and implemented Product Search caching using rolled-up tables of denormalized product
look-up information frequently searched by customers, thereby freeing significant I/O and CPU resources
 Refined and troubleshot hourly order flow DBMail reporting
 Identified, escalated, and advised on non-SARGable aspects of resource-intensive queries deployed by
developers that previously caused stoppages at month-end peaks
 Implemented weekly processes to groom or move old data into archive databases, keeping OLTP databases
lean for faster CRUD operations and reducing nightly backup storage
 Implemented previously developed library of stored procs to automate nightly backups, transaction log
backups, index defrags and rebuilds, statistics updates, integrity checks, and database shrinks on all instances
 Developed and implemented audit trail triggers to identify mysterious updates to SKU Item Master tables
 Managed Idera diagnostic solution and mined the perfmon repository for information on HDD burn rates,
CPU utilization trends, slow-running queries, I/O levels, deadlocks, table blocking, wait statistics, and
various other system state information
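
For context on the non-SARGable findings mentioned above, a typical rewrite looks like the following; the table and column names are hypothetical illustrations, not from the actual codebase.

```sql
-- Non-SARGable: applying a function to the column defeats index seeks,
-- forcing a full scan of the (hypothetical) Orders table.
SELECT OrderID FROM dbo.Orders WHERE YEAR(OrderDate) = 2014;

-- SARGable rewrite: a bare-column range predicate lets the optimizer
-- seek on an index over OrderDate instead.
SELECT OrderID FROM dbo.Orders
WHERE OrderDate >= '20140101' AND OrderDate < '20150101';
```

Both queries return the same rows; only the second can use an index seek, which is what matters under month-end peak load.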

US Real Estate Services - Lake Forest, CA


Senior SQL Server DBA
2/11 – 8/13 Industry: Real Estate, Software-as-a-Service Provider
Mandate: Lead DBA team through ongoing growth of the firm's RES.NET web portals. The firm's customers
of its SaaS solution suite include financial institutions who use RES.NET to manage REO and
Short Sale properties as well as eviction attorneys, real estate vendors, and real estate agents tasked
with managing and then selling residential and commercial properties. RES.NET is a cloud
computing solution which leverages business process workflow management. RES.NET has many
solutions -- some are mature "cash cow" products, many are new products designed to take
advantage of niche opportunities in the real estate transaction space. There are 5.5 FTE resources (5
full-time, 1 part-time) allocated to helping the firm evolve and grow revenue. I lead these DBA
resources.
Technical: Windows Server 2008 R2 64-bit environment with Fibre Channel SAN storage involving
approximately twenty SQL Server 2008 R2 database servers. There are also approximately twenty
MySQL 5.2 database servers.
 Achieved and sustained 99.999% availability of database services for SaaS portals (outside of deployment
windows) for all SDLC environments -- PROD, UAT, QA, and DEV
 Planned, designed, and implemented real-time transactional replication processes within SQL Server as well
as between MySQL and SQL Server. (The latter was achieved with an in-house custom solution which I
designed and coordinated with a MySQL DBA and a SQL Server DBA.)
 Planned, designed, and implemented infrastructure to support the needs of the firm's lead Business
Intelligence and SSRS developer. This included procuring and deploying new servers, configuring SQL
Server components, and monitoring performance and tuning server hardware as well as optimizing queries.
 As lead DBA, collaborated with project managers, .NET and Delphi application developers (both on-site and
overseas), and other DBAs on the database team to develop and deploy new solutions as well as deploy bug
fixes and enhancements on a 2-week Scrum cycle
 Interacted directly with customers' technical and project management to work out inter-firm ETL processes.
 Implemented optimization processes and email alerting to aggressively locate and mitigate slow running
stored procedures, severe table blocking, and unnecessary/unused indexes
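
The blocking-detection alerting listed above is commonly driven by SQL Server's dynamic management views; a minimal sketch follows, with the 30-second threshold an illustrative assumption.

```sql
-- Flag sessions blocked for more than 30 seconds so an alert e-mail can
-- be sent; uses standard DMVs, threshold is illustrative.
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time / 1000 AS wait_seconds,
       t.text AS blocked_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0
  AND r.wait_time > 30000;
```

A SQL Server Agent job polling this query every minute and mailing any hits via Database Mail is one common way to wire up such alerts.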

Sheppard Mullin Richter & Hampton LLP - Los Angeles, CA


Senior DBA (SQL Server) and DMS Admin (dual role)
7/08 - 2/11 Industry: Legal Services, Business Law
Mandate: Co-manage production databases and data server infrastructure, provide 100% availability for firm’s
attorneys and administrative staff (approximately 1,100 employees, of whom 550 are attorneys)
throughout ten national offices and a Shanghai office.
Technical: Windows Server 2008 64-bit environment with iSCSI network SAN storage involving approximately
thirty-five SQL Server 2008/2005/2000 database servers. Applications supported are a combination
of third-party legal, financial, and document management systems (Autonomy iManage and kCura
Relativity) as well as in-house customized web-based applets and data-feed processes.
 As the firm’s sole Document Management System (DMS) Administrator, upgraded and managed the
multi-office, multinational server and application infrastructure hosting the firm’s Tier 1 repository solution
for legal documents (WIP, closed, and archived cases). This responsibility became mine in addition to my
other Senior DBA responsibilities after the prior DMS Admin was summarily released in the middle of several
ongoing critical projects.
 As the firm’s sole DMS Admin, planned and deployed the disaster recovery solution using InMage replication
between Phoenix and Los Angeles. Also planned and deployed same-site server redundancy using Microsoft
clustering services. All work was done remotely from Los Angeles via collaboration with Phoenix technical
resources, thus negating the need for travel expenses. My lead role in architecting and implementing multiple
levels of server redundancy has since resulted in 100% availability of this work-product generation
system.
 As lead technical liaison, collaborated with business development team to improve on existing LexisNexis
InterAction CRM platform in an effort to integrate internal data and leverage existing client base for greater
revenue
 Designed, implemented, and deployed Compuware ServerVantage e-mail alerting and dashboard-based
time-series monitoring components, thereby reducing daily database incidents from about two occurrences
per day to none
 Assembled and spun up HP ProLiant DL385 servers as fully operational file and database servers using a
combination of local RAID and iSCSI network SAN storage while collaborating with the firm’s Network
Infrastructure Architect on optimized network throughput and storage deployment
 Worked with the firm’s Litigation Support application administrators as well as third-party application
developers and engineers to address and fulfill end-users’ business requirements; this included reviewing and
deploying upgrade scripts and components on the database servers
 Responsible for data access and network security relating to the database environment
 Collaborated with web developers to identify and optimize problematic database queries
 Worked with “lateral” attorneys and remote staff personnel to import case matter items from attorneys’ prior
firms into our own Interwoven DeskSite document management system

FutureTrade - Lake Forest, CA


Senior DBA (SQL Server)
7/06 - 6/08 Industry: Capital Markets Global Trading, Equities and Options
Mandate: Manage production databases and data server infrastructure, provide 100% availability for client base
of global institutional traders using firm’s fully integrated trading and order processing platform, and
facilitate growth of business by enabling firm to strategically collect, process, and leverage data.
With a Product Management team in NYC, offshore developers in Eastern Europe, and QA and
Technology Infrastructure teams in California, the firm leverages global talent to produce a highly
competitive and highly reliable application suite for institutional traders seeking an integrated
platform that offers a single point of access to multiple ECNs, exchanges, broker/dealers, and
algorithmic trading providers using Financial Information eXchange (FIX) protocol. The firm
employs a rigorous documentation process to capture new requirements, bugs, enhancement requests,
production changes, and case-based support and tracking.
Technical: Java based front-ends with back-end access via JSP. Data hosted on SQL Server 2000 and MySQL
4.0 using dedicated and redundant database servers.
 Achieved 100% availability of production OLTP databases on a 24/6.98 schedule (half-hour weekend maintenance window)
 Gathered specifications, documented, developed, deployed, and maintained log shipping data replication,
including backup and point-in-time recovery components
 Developed, documented, and tested database failover procedures in dual primary/failover server architecture
 Implemented, scheduled, and maintained database maintenance plans, including custom index defrag
processes so as to ensure optimal performance of production database
 Trained and assisted two junior DBAs and three System Operations personnel on how to manage and
troubleshoot database processes as well as how to use proprietary reporting architecture I implemented
 Collaborated with Broker Operations team, Product Management team, Applications Development team,
Quality Assurance team, and System Operations team on opportunities relating to growth of firm’s offerings
and operational efficiency; responsibilities included gathering specifications and documenting
 Collaborated with Broker Operations team and Compliance team on fulfilling data delivery and automated
reporting needs pertaining to specific client requirements gathered and documented by me
 Collaborated with Senior Technical Architect to establish highly fault-tolerant storage area network (SAN)
infrastructure and implemented it with lead Technology Infrastructure resources
 Collaborated with CFO and his finance team on fulfilling automated reporting pertaining to client billing,
internal financial reporting, and revenue reporting to parent company (Interactive Brokers); responsibilities
included gathering specifications and documenting
 Documented and provided detailed justification for migration from SQL Server 2000 to SQL Server 2005
 Documented detailed business justification for migration of MySQL databases into SQL Server core
 Built out a T-SQL-based reporting infrastructure that operates independently of the OLTP database servers,
ensuring high availability of critical OLTP processing resources while distributing 150+ daily reports to
clients, trading partners, internal compliance, and the trading desk using PGP encryption, FTP posting, and
automated e-mails; the same proprietary infrastructure is used for weekly, monthly, and quarterly reporting;
this resulted in at least a 500% increase in the rate at which new reports are developed and deployed
 Implemented step-by-step logging within stored procs so as to isolate and performance-tune costly queries
 Interfaced with vendors providing solutions for reporting, backups, and SAN
 Ensured all required documentation procedures were followed as per best business practices
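
The step-by-step logging pattern mentioned above can be sketched as follows; the table, proc, and step names are hypothetical, not from the actual system.

```sql
-- Hypothetical logging table and pattern for timing steps inside a proc.
CREATE TABLE dbo.ProcStepLog (
    LogID    INT IDENTITY PRIMARY KEY,
    ProcName SYSNAME,
    StepName VARCHAR(100),
    LoggedAt DATETIME2 DEFAULT SYSDATETIME()
);
GO
CREATE PROCEDURE dbo.usp_NightlyBatch
AS
BEGIN
    INSERT dbo.ProcStepLog (ProcName, StepName)
    VALUES (OBJECT_NAME(@@PROCID), 'start');

    -- ... step 1: load staging data ...
    INSERT dbo.ProcStepLog (ProcName, StepName)
    VALUES (OBJECT_NAME(@@PROCID), 'staging loaded');

    -- ... step 2: merge into production ...
    INSERT dbo.ProcStepLog (ProcName, StepName)
    VALUES (OBJECT_NAME(@@PROCID), 'merge complete');
END
```

Comparing the LoggedAt deltas between consecutive rows shows which step consumes the most time and therefore which queries to tune first.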

Independent Contractor / Consultant


7/05 - 3/08 Client: Sprint / Nextel
Industry: Communications
Project: Asset management, optimization, and performance reporting of large regional inventory of cellular
transceivers and supporting equipment
Technical: The client (i.e., pre-merger Nextel) required a web-based solution to help their engineers and
technicians monitor and manage their assets. An ASP-based front-end with a SQL Server back-end
served as the core technologies of this real-time / batch processing hybrid solution. I developed and
deployed the back-end in its entirety. Specifically, I developed ETL processes using Visual Basic .NET
and SQL Server 2000’s DTS to pull disparate raw data from multiple locations within the client’s
WAN, prepped the data for SQL load using VB executables, and then used DTS to load the raw data into
SQL Server, normalizing and processing it with highly optimized stored procedures that I also designed and
developed. I fully automated daily ETL status, suspect raw data, and abend reporting using SQL Server’s
e-mail capabilities. The bulk of the
computations are done within SQL Server on a single production support server and shipped, using
ActiveX VB scripting within DTS, to a production hybrid database/web server used exclusively by
end-users via the front-end. This dual-server setup for the production environment allowed the
ability to process large amounts of data on an isolated server without undermining the front-end
reliability required by end-users. The Field Support database I developed and deployed is a mission-critical,
100+ million row database used daily by 200+ engineers, technicians, supervisors, and area managers. The
success of my solution, along with the Configuration Engineering module, resulted in a high-level mandate
that it be rolled out not only to the other three regions in Nextel but also to all four regions in Sprint.
 Developed and optimized processes to collect and present time series based cell site performance and alarms
statistics resulting in timely field servicing and a 25% reduction in region’s dropped calls
 Coordinated with resources in other geographical offices to set up new data extraction and staging processes
 Provided business justification for acquisition of additional data processing hardware infrastructure
 Facilitated development of failover database servers
 Collaborated with two other SQL Server DBAs, one ASP web developer, and one project manager / business
analyst on this high visibility project

9/04 - 5/05 Client: CalPERS


Industry: Quantitative Investment Management
Project: Factor data generation for client’s internal actively managed $1 billion quantitative portfolio
Technical: The client had a subscription to the same Quantitative Analytics, Inc. SQL Server data warehouse
mentioned below (see 1/03 - 2/04 engagement). During this client engagement, I performed all
database development and administration functions in order to support three C++ developers so that
they could receive their financial data as quickly as possible. I used the same performance tuning
skills attained during the Quantitative Analytics, Inc. engagement listed below. During this client
project, it was discovered that Quantitative Analytics, Inc. did not map their many-to-many
relationships on a historical basis. To resolve this, I created and deployed an entirely new process that
would map disparate vendor data across time and created several vendor-to-vendor mapping tables
with effective date ranges. I went further by mapping these separate mapping tables into a single star
schema aggregate vendor mapping table with effective date ranges. This project involved a testing
database server located in Chicago, IL and a testing and production database server located in
Newport Beach, CA. It was my responsibility to sync the separate database servers. I accomplished this
first with SQL Server’s backup/restore functionality and an FTP server, and later with a customized Excel
VBA tool I developed that generated DDL scripts for tables, views, stored procedures, and UDFs. I also
performed a series of database I/O and overall processing bottleneck analyses to ensure that we were
allocating resources
to the correct problems. As a result of this, I was able to create efficient staging tables where they
were most needed to enhance the speed of the overall end-to-end processing. Also incorporated into
this engagement was an optimized cluster computing architecture which allowed us to take
advantage of concurrent workstation computing while minimizing the processing load on the
database servers. I also implemented a nightly database backup plan and set up failover database servers
to mitigate disaster-recovery risk.
 Developed and optimized look-up stored procedures, functions, and views in SQL Server 2000’s T-SQL to
extract data from IDC Pricing, Compustat, IBES, Lancer, S&P, Barra, ITG, and Advent via a 1.5 billion row
research database
 Developed and optimized nightly batch processes using SQL Server Agent and DTS to create secondary
datasets containing staging data for query lookup, historical cross-vendor identifier mapping linkages, and
domestic trading calendars
 Assisted with design and solely implemented distributed client/server cluster computing architecture in order
to generate a growing factor results database repository, presently with 1.5 billion rows of factor data
 Facilitated development meetings
 Procured and implemented additional fibre-attached RAID enclosures to expand the production database server’s storage

6/04 - 8/04 Client: Coast Asset Management - Santa Monica, CA


Industry: Investment Management, Fixed Income
Project: Integrated systems implementation for trade desk, accounting, and marketing
 Coordinated all enhancements and bug fixes with Integrated Business Systems, Inc. on behalf of the trade
desk and accounting groups, pertaining to their Visual Portfolio Manager (VPM) integrated solution
 Negotiated and secured contract for Financial Interactive’s FundRunner and planned and scheduled
implementation, data conversion, and training for the investor relationship management solution
 Negotiated pricing for iLumen Assentor solution for SEC 17a-4 and NASD 3010 & 3011 compliance for
broker-dealer group

1/03 - 2/04 Client: Quantitative Analytics, Inc. - Chicago, IL


Industry: Investment Software and Research Data Warehousing
Project: Design, development, and deployment of the firm’s new internet-based product, WebQA, which is a
collection of tools similar to that provided by the firm’s primary competitor (FactSet) but extracting
from the firm’s MarketQA database instead
Technical: The client had an integrated SQL Server research data warehouse on which the quickest possible
queries had to be constructed. There were 1.5 billion rows spanned across 300 tables in the data
warehouse. My primary role was to construct these quick queries within stored procedures which
would eventually be called upon by client applets constructed in C++. In this capacity, I relied
heavily on Execution Plan analysis, indexes, and a great deal of trial and error, particularly in the areas of
join analysis and WHERE-clause criteria analysis. To enhance query performance, I constructed secondary
tables that removed the need for expensive joins. I also created and maintained a library of UDFs to keep
T-SQL maintenance to a minimum.
 Prioritized, developed and optimized look-up stored procedures, functions, and views in SQL Server 2000’s
T-SQL to extract data from IDC Pricing, Compustat, IBES, First Call, S&P GICS Direct, MSCI, and
Worldscope vendor data contained in MarketQA’s normalized SQL Server tables
 Developed and optimized nightly batch processes using SQL Server Agent, DTS, and stored procedures to
create secondary datasets containing currency conversion rates, cross-vendor identifier linkages, and domestic
and international trading calendars

9/99 - 1/03 Client: Fuller & Thaler Asset Management - San Mateo, CA
Industry: Investment Management, Equities
Project: Business processes planning and automation of enterprise database centric solutions, including a
research data warehouse which provides all source data for client’s newest Mid-Cap Core and Large-
Cap Core quantitative products
Technical: I gathered specifications, then designed and executed a plan to consolidate the client’s trading,
operational, and research data onto a centralized SQL Server database platform. The resultant
portfolio accounting and research databases were constructed from scratch entirely by me. The
research database was the most challenging because it incorporated millions of rows from disparate
data vendors which had to be integrated. The integration was done using a series of many-to-many
mapping tables that had effective dates and market cap ranking information incorporated into them.
Because some of the data was revised daily, an ETL process using SQL Server’s DTS solution and
managed by SQL Server Agent was constructed which processed the incoming and the existing data
in a staging area and then migrated the finished tables into production. The entire server-to-server
migration was done with Excel VBA for ease of maintenance by the client. Also constructed in
VBA was a table count audit solution that ensured that the staging and production databases were
never out of step. I also implemented a nightly database backup plan and set up failover database servers
to mitigate disaster-recovery risk.
 Recruited and led as many as five sub-contractors specializing in SQL Server, Visual Basic, and MatLab in
providing real-time solutions for data extract, portfolio formulation and optimization, and reporting processes
 Gathered specifications, documented, designed, constructed, enforced data integrity, and standardized update
procedures of proprietary research data warehouse which integrates 125 million rows of historical and current
data:
 Security data (CRSP, IDC via FactSet, Compustat/Xpressfeed, I/B/E/S, Washington Insider, Lancer
Analytics, MSCI)
 Index data (S&P, Russell, MSCI)
 Proprietary data (portfolio holdings, cross-vendor linkages, covariance matrices, derived research data)
 Trained end-users (quantitative researchers, portfolio managers, traders, marketing, and IT personnel) on how
to access warehouse data via ODBC compliant applications, such as Excel, Visual Basic, Access front-end,
MatLab, and S-Plus
 Specified, procured, and implemented Dell database server with multiple fibre attached RAID enclosures for
a total of 1.5 terabytes of storage
 Gathered specifications and then advised on and prototyped distributed grid computing architecture for large
set statistical analysis
 Sole liaison and implementer of FactSet applications and data flow processes, including automated nightly
extract and upload of portfolio holdings for FactSet’s portfolio attribution solution
 Implemented evaluation versions of various portfolio optimizer applications
 Gathered specifications then constructed automated reporting of marketing time-series information relating to
product and client type categorization in order to meet general RFP fulfillment needs
 Gathered specifications, designed, developed, and implemented automated tiered schedule fee statement
generation process
 Proof-of-concept tested Bloomberg API calls via Excel VBA as viable alternative to FactSet
 Gathered specifications, documented, designed, developed, and implemented automated quarterly client
reporting application encapsulating information on account specific portfolio performance, strategy specific
textual content, and various portfolio accounting information extracted from client’s Advent AXYS portfolio
management application

Accenture (formerly Andersen Consulting)


Technology Consultant
3/97 - 8/97 Client: Natural Gas Clearinghouse - Houston, TX
Industry: Energy
Project: On-site implementation and customization of an Andersen Consulting accounting system which
allows client to better manage gas plant related production and sales data and to expedite flow of
royalty check disbursement and state level tax reporting
 Team lead for three other programmers responsible for completing and implementing state tax
reporting modules
 Executed and monitored key disbursements and tax reporting modules on monthly basis
 Designed and deployed royalty check writing processes as per client specific needs
2/97 Client: Deluxe - Minneapolis, MN
Industry: Consumer Products
Project: Business Process Reengineering (BPR) engagement entailing rehosting of fundamental sales and
distribution modules into respective SAP modules
 Installed and deployed Andersen Consulting’s metadata mapping tool within client’s Oracle/UNIX
environment
 Documented standards and procedures for extracting mainframe metadata and for loading of metadata
repository to be used for mapping
2/97 Client: SKYChef - Arlington, TX
Industry: Consumer Products
Project: Outsourced payroll engagement
 Instructed IS personnel on UNIX Kornshell commands and general UNIX usage
 Designed and developed a Visual Basic module which gathered variant W-2 information and created W-2c
forms for federally compliant tax reporting procedures
6/96 - 1/97 Client: LTV Steel - Cleveland, OH
Industry: Industrial Products
Project: Large-scale rehosting of enterprise-wide legacy systems to SAP’s consolidated three-tier
client/server platform
 Developed and provided feedback on mainframe program components during product testing of a
mainframe-to-SAP interface construction methodology employing a Lotus Notes-based repository installed
at the client site
 Planned and implemented performance testing scenarios for proof of concept
 Facilitated as liaison between Information Builders (IB) technical representatives and key UNIX, SAP, and
Oracle technical resources at client site for accelerated implementation of IB installed components on UNIX
servers and NT workstations
 Guided other analysts with front-end modifications of the metadata mapping enablement tool
5/96 - 6/96 Client: LTV Steel - Cleveland, OH
Industry: Industrial Products
Project: Finishing Data Acquisition (FDA) -- custom coding of a shop floor sub-system to track raw
steel coils through to finished product ready for storage and/or customer delivery
 Documented PVCS Version Manager procedures for developers, testers, and administrators
 Analyzed, evaluated, and implemented PVCS Version Manager source control package
 Assisted with evaluation and documentation of mainframe-UNIX messaging and file transfer architectures

5/95 - 4/96 Client: Mitchell Energy/Andersen Consulting Energy Solutions Group - Dallas, TX
Industry: Energy
Project: Full-scale migration and rehosting of an upstream energy revenue and cost allocation system,
Volume Management System (VMS), from a COBOL mainframe environment to a client/server
environment with the flexibility of either operating alone or as an integrated solution with another
industry-specific Andersen Consulting sub-system, The Intelligent Plant System (TIPS)
 Developed and implemented screen-level user-access privileges
 Planned and initiated technical training for new programmers
 Established and enforced programming standards and naming conventions
 Spearheaded GUI on-line and client/server architecture development effort; provided technical support for
three programmer/analysts
 Provided preliminary and on-site technical support for presentations to prospective clients
 Designed, developed, and deployed architectural blocks utilizing traditional programming methods and
object-oriented programming methods
 Developed and implemented on-line help

Ernst & Young


Information Technology Management Consultant
9/94 - 5/95 Client: PageNet - Plano, TX
Industry: Communications
Project: Provided large-scale support for MCI’s initiative into paging services. The system was designed
with an inherently flexible structure to support future initiatives involving additional large-scale
resellers
 Solely responsible for COBOL data interface modules that first, uploaded MCI’s Friends and Family raw
database records (averaging 5,000 records per day) from 9-track tape and second, formatted the data for load
into client’s Informix order entry (OE) and customer information system (CIS) database
 Designed, developed, and implemented entire shipping support system and shipping process interfacing
client’s OE-CIS with FedEx Powership stations, leading to prompt delivery of 5,000 pagers per day;
interviewed inventory personnel
 Designed, developed, and implemented national pager phone number assignment module utilizing complex
databases from Center for Communications for Management Information (CCMI)
 Designed, developed, and implemented electronic invoicing module; interviewed billing personnel

United States Marine Corps - Quantico, VA


9/90 - 8/92 Testing Noncommissioned Officer (NCO)
Industry: Military Training
Project: Implementation of a vendor’s client/server training support system that had lacked some desired
functionality
 Developed and deployed adjunct reporting modules to allow timely review of student performance for 3,600
Marine Lieutenants and Warrant Officers on an individual basis and with respect to the rest of the class for
the purpose of occupational assignment and for day-to-day class management operations
 Received a Navy Commendation Medal for having employed initiative in solving base-wide process problems
with technical solutions
12/88 - 9/90 Shipping and Receiving NCO
 Supervised receiving and distribution of equipment and parts; $2.5 million total value.
Supply Warehouse Clerk
 General warehouse daily tasks in support of base services operations

PUBLIC SPEAKING

4/14/04 Chicago Quantitative Alliance (www.CQA.org) Spring Conference - Las Vegas, NV


Grid Computing: Theory and Application
 Presented to audience of approximately 100 investment managers the concept of grid computing and its
growing usage as a tool to distribute and process very large amounts of concurrent computations
 Demonstrated to audience how to construct and manage their own on-site grid computing solution using
readily available and inexpensive software and hardware components

3/29/01 The University of Texas at Austin - Austin, TX


MIS and Investment Management
 Presented how disparate research data challenges can be solved with a centralized data warehouse repository
 Illustrated the quantitative and compliance complexities involved in the process of reporting investment
results to clients

TECHNICAL EXPERIENCE

· SQL Server 2000 to 2016 · Visual Basic 6.0 & .Net · MatLab
· Idera Diagnostic Manager · Visual C++ · S-Plus
· Oracle · PowerBuilder · MS Office
· MS Access · COBOL · MS Project
· MS FoxPro · Pascal · MS Visio
· Crystal Reports · SourceSafe · UNIX
· QuickBooks · PVCS · Kornshell
· Quantal Optimizer · Windows Server 2008 R2 · Perl
· Advent AXYS · FundRunner · VMS

ACTIVITIES
 Conceived and assisted with the development of a $32,000 annual University of Texas MIS scholarship
 Marine Corps Marathon (D.C.) finisher, 1991, 1990, 1989
 Certified advanced SCUBA diver, National Association of Underwater Instructors
 Honorable Discharge, USMC, Rank of Sergeant
 Expert Rifle Badge (three consecutive years)
 Navy Commendation Medal
 Three Letters of Appreciation
 Extensively traveled throughout Western Europe, Middle East, and Far East
