+1 (917) 837-5163 | 116 Pershing Ave, Ridgewood, NJ 07450 |
Strategic and knowledgeable quantitative software engineer with more than 20 years of experience on a Statistical Arbitrage desk. Deep understanding of all aspects of technology, hardware and software: financial data modeling, data acquisition and manipulation, trade execution, direct feeds, market microstructure, pre- and post-trade analytics, historical simulations and backtesting. Traded various asset classes (Credit Default Swap indices, FX forwards, index and commodity futures, index and equity swaps) across US, EU and Asian markets. Programmed much of the software that powered every aspect of a four-person quantitative alpha-generating desk trading a significant portion of the US and European equity markets. Extensive experience with k3/kdb (one of the first licenses) and q/kdb+ (one of the earliest adopters). Fastest solution among all kdb+ programmers in the KxCon 2016 challenge in Montauk, NY. Instrumental in early adoption of many key initiatives – kdb, x86-64, machine learning and deep learning, big data and GPUs. Detail-oriented, highly motivated problem solver with a strong technical background. US Citizen. Can-do attitude.
•Python, C, C++, MATLAB, SAS
•q/kdb+, large historical and real-time databases
•Cluster, grid and cloud computing (AWS, Azure)
•CUDA, cuDNN, Numba, TensorFlow, Keras, PyTorch
•NumPy, pandas, PyArrow, scikit-learn
•Linux x86-64, Linux/ARM, Solaris, Windows, macOS
Consultant / Interim Senior Vice President for a large Sovereign Wealth Fund in the Middle East | 08/2021 – Present |
Head of the group responsible for:
•designing a robust, scalable research infrastructure
•automating the construction and processing of proprietary data
•automating processes for data downloading, cleaning and processing
•developing tools and processes used in risk, attribution, exposure, liquidity and transaction cost reporting and analytics
•architecting, designing and supporting the automated investment data platform
•designing the middleware layer between quantitative researchers, the data lake and other internal and external data sources
Consultant / Financial Technology and Data Architect | 10/2020 – 07/2021 |
•Planned and executed transformation of trading and technology infrastructure for the firm.
•Significantly improved backtest capability for Equity, Fixed Income and Futures instruments.
•Implemented significant improvements to compliance systems and processes.
•Designed and implemented infrastructure for “tick by tick” exchange data for equities and futures.
•Redesigned firm’s trading, development and research infrastructure in the cloud (AWS).
•Designed and implemented changes to data and analytics pipelines required by external vendors, allowing the firm to continue sourcing external data and analytics while realizing substantial cost savings.
Head of Information Technology and Head Trader | 08/2019 – 10/2020 |
•Took the firm from an empty office with limited electrical wiring to a running office with hybrid “on prem”/cloud (Azure) infrastructure and a full electronic trading platform supporting multiple asset classes and brokers.
•Planned, coordinated and executed the entire setup from scratch: server room and server hardware, network, fiber optic lines, cloud setup, proprietary software and Order Management System.
•Designed, maintained and ran most of the software around the proprietary Global Macro strategy – from data acquisition through trade execution and post-trade processing.
•Executed trades in:
✔CDX HY (USD 2+ billion notional), CDX NA IG, CDX ITRX EU using Bloomberg SEF
✔FX forwards using the Refinitiv (Thomson Reuters) FXall electronic platform
✔Index and equity synthetics via FIX/OMS for US, EU and Asia
✔Index and commodity futures via FIX/OMS for US, EU and Asia
✔Short-term US Government Treasuries OTC
•Designed and built custom software that connected several in-house and external systems.
Executive Director, Global Macro Strategies group | 03/2018 – 08/2019 |
•Part of the Global Macro Strategies group that later spun off into a separate fund while retaining all of its assets.
•Planned and executed a complete redesign of the proprietary desk’s software, enabling faster, more efficient data acquisition and ingestion, trade generation, trade execution and post-trade processing. This supported nearly seamless six-fold growth in assets while improving efficiency and run time and maximizing precision and accuracy.
•Worked directly with several vendors on alternative data sources.
•Planned and executed the transition from proprietary vendors to in-house custom solutions, significantly cutting data acquisition expenses and processing time while improving accuracy.
Quantitative Software Engineer, Q4 and QuantOne groups | 01/2012 – 01/2018 |
•Designed and wrote most of the software for the desk’s “Data Lake” (q4). All incoming raw data was pre-processed, analyzed and written at different levels of granularity and precision for further consumption by the models and backtests.
•Low-latency market data feeds using Redline InRush.
•Low-latency execution using Redline Order Execution Gateway.
•Built market/fill simulators and several backtest engines using collected order book (level 2), level 1 and aggregated data. The backtest engine ran a 3,000+ stock, global, multi-exchange, multi-currency portfolio on intraday data, processing one year of intraday orders per minute of run time – cutting a “10 year backtest” from 3 days to 10 minutes.
•Wrote a 64-bit Python API for Bloomberg before Bloomberg released its own Python API.
•Expanded the use of Bloomberg as a primary/secondary/tertiary data source.
•Real-time and historical collection of pricing, fundamental, reference, analyst estimate, guidance and news data from Bloomberg, Reuters and Dow Jones.
•Deep learning pipeline using TensorFlow and GPUs.
•Analyzed alternative datasets – Twitter sentiment, credit card spending and location data.
•Designed a trade execution front end that allowed real-time monitoring of trade performance, risk and cost analysis.
•Designed a custom front end for research, data analysis, visualization and manipulation in kdb.
•Contributed to the development of the stock-specific active/passive fill model.
•Improved volume prediction models to account for intraday characteristics and dynamically changing market conditions.
•Analyzed real-time and historical market data for all major equity markets across the world – NYSE, NASDAQ, TSX, LSE, Euronext, Xetra, BME, SIX Swiss, Nordic, Oslo, Brazil, Japan, Korea.
Quantitative Software Engineer, Winkler/Weingarten Quant group | 07/2006 – 12/2011 |
•Large-scale historical and real-time data analysis with many application servers in kdb and kdb+.
•Intraday data capture using kdb/kdb+ with direct-to-exchange and Reuters feeds (UTP, CTA, OPRA, CME, LSE, Euronext, Xetra, Milan, SIX Swiss).
•Variety of historical and real-time data sources – MarketQA, FastTick, Bloomberg, Reuters, index composition, historical products, earnings estimates, cross-reference and corporate actions.
•Designed and maintained a system that used hosted, broker and in-house algos to execute trades in US and EU markets. The system supported automatic failover and real-time analysis of issues, trading costs and risk, and allowed trading with little or no human intervention.
Assistant Vice President, Equity Analytics Technology group | 05/2004 – 07/2006 |
•Designed and implemented the back-end engine for Lehman Brothers Multibroker TradeGauge, a global post-trade analytics tool and competitor to ITG’s Transaction Cost Analysis, and one of Lehman’s most sought-after initiatives. This tool helped Lehman Brothers retain existing clients and gain many new ones for its LMX execution engine.
•Large-scale historical data analysis and benchmarking across a variety of application servers in kdb and kdb+: level 1 equities data, level 1 futures data, level 2 equities data (250 million ticks per day, >100 billion total), level 2 futures data and EOD options data.
•Intraday data capture and on-the-fly calculations using kdb/kdb+ with direct-to-exchange WOMBAT feeds (UTP, CTA, OPRA, CME).
•Extended the LB Data Container Framework – a middleware and abstraction layer on top of kdb and kdb+ application servers that provides fault tolerance and distributes load across data centers.
•Variety of other historical and real-time data sources – MarketQA, FastTick, Bloomberg datafeed, index composition service, historical products, cross-reference and corporate actions service.
Software Engineer | 06/2002 – 05/2004 |
•Full development cycle for statistical arbitrage group.
•Real-time US and international data feeds with Reuters/kdb (tickerplant).
•Proprietary historical TAQ database (daily trades and quotes from 1993 onward, combined with proprietary data).
•One of the first Linux/x86-64 servers and software stacks on Wall Street.
•Feeds for historical prices, corporate actions and earnings estimates – Fame, Datastream, I/B/E/S.
•Designed processes for automated pre- and post-trade, liquidity and trading cost analysis.
•Instrumental in bringing k/kdb solutions to the firm, enabling a substantial increase in the speed, quantity and accuracy of research and trading platforms.
Applications Developer / Network Administrator | 09/2001 – 05/2002 |
•Developed, tested and documented custom web applications that allowed users secure remote access to information and email via Intranet and Internet.
•Web based portfolio performance tracking and analytics.
•Managed the corporate network with a mix of Windows/Exchange and Linux (Apache, Sendmail, Samba) servers.
•Participated in investment ideas revolving around Russia, Ukraine, China, Indonesia, Korea and other international markets.
Master of Science in Computer Science
Oklahoma City University, 2001. Full scholarship.
Bachelor of Science in Business Administration
Oklahoma City University, 2000. Full scholarship. President’s Honor Roll.