Senior SQL Server DBA: Dan Storm
EDUCATION
EXPERIENCE
Developed backup and restore stored procs to automatically refresh DEV and QA environments on an as-needed
basis with PROD data and then redact any sensitive information (e.g., passwords, SSNs, credit cards)
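A minimal T-SQL sketch of that refresh-and-redact pattern (database, file, table, and column names below are illustrative placeholders, not the actual environment):

    -- Restore the latest PROD backup over the DEV copy (illustrative names and paths)
    RESTORE DATABASE DevSales
        FROM DISK = N'\\backupshare\ProdSales_full.bak'
        WITH REPLACE,
             MOVE N'ProdSales_Data' TO N'D:\Data\DevSales.mdf',
             MOVE N'ProdSales_Log'  TO N'L:\Log\DevSales.ldf';

    -- Redact sensitive columns before handing the copy to developers
    UPDATE DevSales.dbo.Customers
    SET    SSN          = '000-00-0000',
           CreditCardNo = 'XXXX-XXXX-XXXX-' + RIGHT(CreditCardNo, 4),
           PasswordHash = NULL;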
Designed, developed, and implemented Product Search caching using rolled-up tables of de-normalized
product look-up information frequently searched by customers, thereby freeing significant I/O and CPU
resources
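A simplified sketch of the rolled-up cache table idea (schema and names are hypothetical); searches then hit the single wide table instead of re-joining the normalized catalog on every request:

    -- Denormalized search cache, rebuilt periodically from the normalized catalog tables
    SELECT p.ProductID,
           p.SKU,
           p.ProductName,
           c.CategoryName,
           b.BrandName,
           i.QtyOnHand
    INTO   dbo.ProductSearchCache
    FROM   dbo.Products   AS p
    JOIN   dbo.Categories AS c ON c.CategoryID = p.CategoryID
    JOIN   dbo.Brands     AS b ON b.BrandID    = p.BrandID
    JOIN   dbo.Inventory  AS i ON i.ProductID  = p.ProductID;

    -- Index the cache on the columns customers actually search
    CREATE CLUSTERED INDEX IX_ProductSearchCache_SKU
        ON dbo.ProductSearchCache (SKU);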
Refined and troubleshot hourly order flow DBMail reporting
Identified, escalated, and advised on non-SARGable predicates in resource-intensive queries previously
deployed by developers that had caused stoppages during month-end peak loads
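For illustration, a typical non-SARGable predicate and its SARGable rewrite (table and column names are hypothetical):

    -- Non-SARGable: the function wrapped around the column blocks an index seek
    SELECT OrderID
    FROM   dbo.Orders
    WHERE  YEAR(OrderDate) = 2011 AND MONTH(OrderDate) = 12;

    -- SARGable rewrite: a range predicate on the bare column allows an index seek
    SELECT OrderID
    FROM   dbo.Orders
    WHERE  OrderDate >= '20111201' AND OrderDate < '20120101';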
Implemented weekly processes to either groom or move old data into archive databases in order to keep
OLTP databases lean, allowing faster CRUD operations and reducing nightly backup storage
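A minimal sketch of that grooming pattern, assuming a hypothetical Orders table with a matching table in an archive database:

    -- Move rows older than two years to the archive database in small batches
    -- to limit blocking and transaction log growth
    WHILE 1 = 1
    BEGIN
        DELETE TOP (5000) o
        OUTPUT deleted.* INTO ArchiveDB.dbo.Orders
        FROM   dbo.Orders AS o
        WHERE  o.OrderDate < DATEADD(YEAR, -2, GETDATE());

        IF @@ROWCOUNT = 0 BREAK;
    END;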
Implemented previously developed library of stored procs to automate nightly backups, transaction log
backups, index defrags and rebuilds, statistics updates, integrity checks, and database shrinks on all instances
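As an illustration of the index-maintenance portion of such a library, a trimmed sketch that generates defrag or rebuild commands from fragmentation levels (thresholds and naming are assumptions):

    -- Emit REORGANIZE for moderately fragmented indexes, REBUILD for heavily fragmented ones;
    -- the full proc would loop over these commands and execute them
    SELECT 'ALTER INDEX ' + QUOTENAME(i.name)
         + ' ON ' + QUOTENAME(OBJECT_SCHEMA_NAME(s.object_id))
         + '.' + QUOTENAME(OBJECT_NAME(s.object_id))
         + CASE WHEN s.avg_fragmentation_in_percent > 30
                THEN ' REBUILD;' ELSE ' REORGANIZE;' END AS maintenance_cmd
    FROM   sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS s
    JOIN   sys.indexes AS i
           ON i.object_id = s.object_id AND i.index_id = s.index_id
    WHERE  s.avg_fragmentation_in_percent > 10
      AND  i.name IS NOT NULL;   -- skip heaps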
Developed and implemented audit trail triggers to identify unexplained updates to SKU Item Master tables
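A minimal sketch of such an audit trigger, assuming a hypothetical ItemMaster table and audit log:

    CREATE TRIGGER trg_ItemMaster_Audit
    ON dbo.ItemMaster
    AFTER UPDATE
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Record who changed each SKU's price, the before/after values, and when
        INSERT INTO dbo.ItemMasterAudit (SKU, OldPrice, NewPrice, ChangedBy, ChangedAt)
        SELECT d.SKU, d.Price, i.Price, SUSER_SNAME(), GETDATE()
        FROM   inserted AS i
        JOIN   deleted  AS d ON d.SKU = i.SKU
        WHERE  ISNULL(i.Price, 0) <> ISNULL(d.Price, 0);
    END;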
Managed the Idera diagnostic solution and mined its perfmon repository for information on HDD burn
rates, CPU utilization trends, slow-running queries, I/O levels, deadlocks, table-level blocking, wait statistics, and
various other system state indicators.
As the firm’s sole Document Management System (DMS) Administrator, upgraded and managed the multi-
office, multi-national server and application infrastructure that hosts the firm’s Tier 1 repository solution for
legal documents (WIP, closed, and archived cases). I assumed this responsibility, in addition to my other
Senior DBA duties, after the prior DMS Admin was summarily released in the middle of several ongoing
critical projects.
As the firm’s sole DMS Admin, planned and deployed the disaster recovery solution using InMage replication
between Phoenix and Los Angeles, and planned and deployed same-site server redundancy using Microsoft
clustering services. All work was done remotely from Los Angeles in collaboration with Phoenix technical
resources, eliminating travel expenses. My lead role in architecting and implementing multiple levels of
server redundancy has since resulted in 100% availability of this work-product generation system.
As lead technical liaison, collaborated with the business development team to improve the existing LexisNexis
InterAction CRM platform, integrating internal data and leveraging the existing client base for greater
revenue
Designed, implemented, and deployed Compuware ServerVantage e-mail alerting and dashboard-based time-
series monitoring components, reducing database incidents from about two occurrences per day to none
Assembled and spun up HP ProLiant DL385 servers as fully operational file and database servers using a
combination of local RAID and iSCSI SAN storage, collaborating with the firm’s Network Infrastructure
Architect on optimized network throughput and storage deployment
Worked with the firm’s Litigation Support application administrators as well as third-party application
developers and engineers to address and fulfill end-users’ business requirements, including reviewing and
deploying upgrade scripts and components on the database servers
Responsible for data access and network security relating to the database environment
Collaborated with web developers to identify and optimize problematic database queries
Worked with “lateral” attorneys and remote staff personnel to import case matter items from attorneys’ prior
firms into our own Interwoven DeskSite document management system.
Collaborated with the CFO and his finance team on automated reporting pertaining to client billing,
internal financial reporting, and revenue reporting to the parent company (Interactive Brokers); responsibilities
included gathering specifications and documenting requirements
Documented and provided detailed justification for migration from SQL Server 2000 to SQL Server 2005
Documented detailed business justification for migration of MySQL databases into SQL Server core
Built out a T-SQL-based reporting infrastructure that operates independently of the OLTP database servers,
ensuring high availability of critical OLTP processing resources while distributing 150+ daily reports to
clients, trading partners, internal compliance, and the trading desk using PGP encryption, FTP posting, and
automated e-mails; the same proprietary infrastructure is used for weekly, monthly, and quarterly reporting and
increased the rate at which new reports are developed and deployed by at least 500%
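One piece of that pipeline, e-mail delivery of a finished report, can be sketched with SQL Server's Database Mail (the profile, recipients, and query below are placeholders):

    -- Send one report as a comma-separated attachment via Database Mail
    EXEC msdb.dbo.sp_send_dbmail
         @profile_name                = N'ReportingProfile',
         @recipients                  = N'compliance@example.com',
         @subject                     = N'Daily Trade Summary',
         @query                       = N'SELECT * FROM Reporting.dbo.DailyTradeSummary',
         @attach_query_result_as_file = 1,
         @query_attachment_filename   = N'DailyTradeSummary.csv',
         @query_result_separator      = ',';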
Implemented step-by-step intra-stored-proc logging to isolate and performance-tune costly queries
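A rough sketch of this kind of step logging (log table, proc, and step names are hypothetical):

    -- Inside a long-running stored proc, timestamp each step so the costly ones stand out
    DECLARE @t0 DATETIME;
    SET @t0 = GETDATE();

    -- ... step 1: load staging data ...

    INSERT INTO dbo.ProcStepLog (ProcName, StepName, DurationMs, LoggedAt)
    VALUES ('usp_NightlyOrderRollup', 'LoadStaging',
            DATEDIFF(MILLISECOND, @t0, GETDATE()), GETDATE());

    SET @t0 = GETDATE();

    -- ... step 2: aggregate and publish ...

    INSERT INTO dbo.ProcStepLog (ProcName, StepName, DurationMs, LoggedAt)
    VALUES ('usp_NightlyOrderRollup', 'Aggregate',
            DATEDIFF(MILLISECOND, @t0, GETDATE()), GETDATE());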
Interfaced with vendors providing solutions for reporting, backups, and SAN
Ensured all required documentation procedures were followed as per best business practices
database server located in Chicago, IL and a testing and production database server located in
Newport Beach, CA. It was my responsibility to sync the separate database servers. I
accomplished this first by employing SQL Server's backup/restore functionality and an FTP server,
and later with a customized Excel VBA tool I developed that generated DDL scripts for
tables, views, stored procedures, and UDFs. I also performed a series of database I/O and overall
processing bottleneck analyses to ensure that we were allocating resources to the correct problems.
As a result, I was able to create efficient staging tables where they were most needed to speed up
the overall end-to-end processing. Also incorporated into this engagement was an optimized cluster
computing architecture that allowed us to take advantage of concurrent workstation computing while
minimizing the processing load on the database servers. I also implemented a nightly database backup
plan and set up failover database servers to mitigate disaster-recovery risk.
Developed and optimized look-up stored procedures, functions, and views in SQL Server 2000’s T-SQL to
extract data from IDC Pricing, Compustat, IBES, Lancer, S&P, Barra, ITG, and Advent via a 1.5 billion row
research database
Developed and optimized nightly batch processes using SQL Server Agent and DTS to create secondary
datasets containing staging data for query lookup, historical cross-vendor identifier mapping linkages, and
domestic trading calendars
Assisted with design and solely implemented distributed client/server cluster computing architecture in order
to generate a growing factor results database repository, presently with 1.5 billion rows of factor data
Facilitated development meetings
Procured and implemented expansionary fibre attached RAID enclosures for production database server
9/99 - 1/03 Client: Fuller & Thaler Asset Management - San Mateo, CA
Industry: Investment Management, Equities
Project: Business processes planning and automation of enterprise database centric solutions, including a
research data warehouse which provides all source data for client’s newest Mid-Cap Core and Large-
Cap Core quantitative products
Technical: I gathered specifications then designed and executed a plan to centralize the client's trading,
operational, and research data onto a single SQL Server database platform. I constructed the
resulting portfolio accounting and research databases entirely from scratch. The research database
was the most challenging because it integrated millions of rows from disparate data vendors. The
integration was done with a series of many-to-many mapping tables that incorporated effective dates
and market cap ranking information. Because some of the data was revised daily, I built an ETL
process, using SQL Server's DTS solution and managed by SQL Server Agent, that processed the
incoming and existing data in a staging area and then migrated the finished tables into production.
The entire server-to-server migration was done with Excel VBA for ease of maintenance by the client.
Also constructed in VBA was a table count audit solution that ensured the staging and production
databases were never out of step. I also implemented a nightly database backup plan and set up
failover database servers to mitigate disaster-recovery risk.
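A rough sketch of the shape of one such effective-dated mapping table and a point-in-time lookup against it (all names, types, and values are assumptions, not the client's actual schema):

    -- Cross-vendor identifier mapping with effective dating and market cap ranking
    CREATE TABLE dbo.VendorSecurityMap (
        InternalSecurityID INT         NOT NULL,
        VendorCode         VARCHAR(20) NOT NULL,   -- e.g. COMPUSTAT, IBES, IDC
        VendorSecurityID   VARCHAR(40) NOT NULL,
        MarketCapRank      INT         NULL,
        EffectiveFrom      DATETIME    NOT NULL,
        EffectiveTo        DATETIME    NULL,        -- NULL = still current
        CONSTRAINT PK_VendorSecurityMap
            PRIMARY KEY (InternalSecurityID, VendorCode, VendorSecurityID, EffectiveFrom)
    );

    -- Point-in-time lookup: which vendor identifier applied on a given date
    SELECT VendorSecurityID
    FROM   dbo.VendorSecurityMap
    WHERE  InternalSecurityID = 12345
      AND  VendorCode = 'COMPUSTAT'
      AND  EffectiveFrom <= '20020630'
      AND (EffectiveTo IS NULL OR EffectiveTo > '20020630');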
Recruited and led as many as five sub-contractors specializing in SQL Server, Visual Basic, and MatLab in
providing real-time solutions for data extraction, portfolio formulation and optimization, and reporting processes
Gathered specifications, documented, designed, constructed, enforced data integrity, and standardized update
procedures of proprietary research data warehouse which integrates 125 million rows of historical and current
data:
Security data (CRSP, IDC via FactSet, Compustat/Xpressfeed, I/B/E/S, Washington Insider, Lancer
Analytics, MSCI)
Index data (S&P, Russell, MSCI)
Proprietary data (portfolio holdings, cross-vendor linkages, covariance matrices, derived research data)
Trained end-users (quantitative researchers, portfolio managers, traders, marketing, and IT personnel) on how
to access warehouse data via ODBC compliant applications, such as Excel, Visual Basic, Access front-end,
MatLab, and S-Plus
Specified, procured, and implemented a Dell database server with multiple fibre-attached RAID enclosures for
a total of 1.5 terabytes of storage
Gathered specifications and then advised on and prototyped distributed grid computing architecture for large
set statistical analysis
Sole liaison and implementer of FactSet applications and data flow processes, including automated nightly
extract and upload of portfolio holdings for FactSet’s portfolio attribution solution
Implemented evaluation versions of various portfolio optimizer applications
Gathered specifications then constructed automated reporting of marketing time-series information relating to
product and client type categorization in order to meet general RFP fulfillment needs
Gathered specifications, designed, developed, and implemented automated tiered schedule fee statement
generation process
Proof-of-concept tested Bloomberg API calls via Excel VBA as viable alternative to FactSet
Gathered specifications, documented, designed, developed, and implemented automated quarterly client
reporting application encapsulating information on account specific portfolio performance, strategy specific
textual content, and various portfolio accounting information extracted from client’s Advent AXYS portfolio
management application
5/95 - 4/96 Client: Mitchell Energy/Andersen Consulting Energy Solutions Group - Dallas, TX
Industry: Energy
Project: Full-scale migration and rehosting of an upstream energy revenue and cost allocation system,
Volume Management System (VMS), from a COBOL mainframe environment to a client/server
environment with the flexibility of either operating alone or as an integrated solution with another
industry-specific Andersen Consulting sub-system, The Intelligent Plant System (TIPS)
Developed and implemented screen-level user-access privileges
Planned and initiated technical training for new programmers
Established and enforced programming standards and naming conventions
Spearheaded GUI on-line and client/server architecture development effort; provided technical support for
three programmer/analysts
Provided preliminary and on-site technical support for presentations to prospective clients
Designed, developed, and deployed architectural blocks utilizing both traditional and object-oriented
programming methods
Developed and implemented on-line help
PUBLIC SPEAKING
TECHNICAL EXPERIENCE
· SQL Server 2000 to 2016 · Visual Basic 6.0 & .Net · MatLab
· Idera Diagnostic Manager · Visual C++ · S-Plus
· Oracle · PowerBuilder · MS Office
· MS Access · COBOL · MS Project
· MS FoxPro · Pascal · MS Visio
· Crystal Reports · SourceSafe · UNIX
· QuickBooks · PVCS · Kornshell
· Quantal Optimizer · Windows Server 2008 R2 · Perl
· Advent AXYS · FundRunner · VMS
ACTIVITIES
Conceived and assisted with the development of a $32,000 annual University of Texas MIS scholarship
Marine Corps Marathon (D.C.) finisher, 1991, 1990, 1989
Certified advanced SCUBA diver, National Association of Underwater Instructors
Honorable Discharge, USMC, Rank of Sergeant
Expert Rifle Badge (three consecutive years)
Navy Commendation Medal
Three Letters of Appreciation
Traveled extensively throughout Western Europe, the Middle East, and the Far East