Details of my career as a "Data Professional": highlights, cool projects, videos, and so on. The objective is to register, and to share, what doesn't fit into a "regular CV". At the end of the day, a picture is worth a thousand words, like the image below: my collection of conference badges. This CV has photos of some of these deliveries, as well as images and links for videos I recorded for Microsoft AI and Data organizations.
The next image is interesting. I created this CV visualization, the first image below, for a "Data Scientist Role" presentation. I also used it in my "How I Became a Data Scientist" post, which had thousands of views. This image was also useful to help me understand how much my career has changed, how I went from reactive to proactive in adopting new technologies. I really like to be an early adopter of new technologies.
Don't you think that this kind of CV is much more efficient than a regular one?
Python SDK and Azure Synapse Link Program Manager. Retention analysis, customer feedback analysis, customer engagements, feature planning, supportability, documentation, demos, conferences, webinars, support case reviews, and content development.
Initially working with the LearnAI team, I was an AI Developer and Instructor. I created and delivered trainings that covered end-to-end AI solutions using multiple products such as Cognitive Services, Azure Machine Learning, Cosmos DB, and Bots. All content included theory materials and hands-on labs. I delivered in-person and online trainings, including "train the trainers" courses, and presented at all of the biggest Microsoft conferences: Ready, Inspire, Ignite, and MLADS. I had the opportunity to teach AI in Redmond, Seattle, Bellevue, Bogota, Sao Paulo, London, Lisbon, Singapore, Sydney, Amsterdam, New York, Orlando, and Dallas.
In March 2019 we became the ACE (AI Customer Engagements) team. We work on critical AI projects with ISVs, GSIs, and special customers. The job is to help critical AI projects, from trainings to MVP development. The engagements can range from a few hours to months. At the end of each project, we share the lessons learned with the community through conference sessions, videos, and blog posts.
Some of them are pretty simple, while others are quite comprehensive. The fact is, I don't like rework or messy data, so I have everything saved and organized.
- Python Custom Skills Toolkit
- PowerShell Toolkit
- Vertica Toolkit
- Hadoop Toolkit
- Oracle Toolkit
- MSSQL Toolkit
- Mine Knowledge from File Names and Paths using a Python Custom Skill --> Innovation!!!
- Using Azure-Functions-for-Python to create a Cognitive Search Filtering Custom Skill --> Innovation!!!
- Announcement: Knowledge Mining Solution Accelerator (KMA v1.0)
- Learn How to Mine Knowledge from Audio Files --> Innovation!!!
- Learn How to Create End-to-End Solutions with Microsoft AI
- Announcing the Python Custom Skills Toolkit --> Innovation!!!
- How to Organize your Data Lake
- How Bots Change your Data Architecture
- Polyglot Persistence with Azure Data Services
- Azure Cosmos DB for AI Engineers
- Azure Synapse Link Custom Partitioning
- Azure Synapse Link CDC from analytical store
- Azure Synapse Link for existing Containers
- Azure Synapse Link for Gremlin
- New Power BI connector for Azure Cosmos DB
- Azure Synapse Link Updates
- Binary Encoding for Azure Cosmos DB
- Fabric API for GQL for Azure Cosmos DB
- Cost Reduction with Azure Cosmos DB Reserved Capacity
I have been delivering sessions and trainings since November 2017 at the most important Microsoft conferences: MLADS, Intelligent Cloud Bootcamp, AI Airlifts, Inspire, Ignite, and Microsoft Ready.
Where, When | Details |
---|---|
Redmond, November 2019 | Partners Cloud Bootcamp, delivering a Cognitive Services session |
Orlando, November 2019 | Microsoft Ignite 2019, working at the Vision Services booth |
Orlando, November 2019 | Delivering an AI session at Microsoft Ignite for 120 people |
Bellevue, October 2019 | Impact recognition in the Microsoft AI Newsletter |
Seattle, October 2019 | AI session for 300 people at the Intelligent Cloud Bootcamp |
Bellevue, October 2019 | Presenting end-to-end AI solutions to 700 people at the AI All Hands |
Redmond, September 2019 | Certified Hackathon Leader, Gold Speaker, AI Engineer, and Trainer, among 10 other certifications |
Dallas, September 2019 | AI training for partners, GBBs, and CSAs |
Redmond, August 2019 | Team dashboard: 188 NSAT, almost 1,000 in-person attendees |
Las Vegas, July 2019 | Microsoft Ready speaker, delivered the Knowledge Mining workshop at the main event |
Redmond, July 2019 | AI training for Microsoft Global Black Belts |
Redmond, April 2019 | Content Moderator webinar for the Microsoft AI Inner Circle Program. You can see part of the recording here |
Redmond, June 2019 | KMB added to Microsoft AI School. This training was also listed as an official Cognitive Search training at Azure.com |
Redmond, April 2019 | Intelligent Cloud Bootcamp best session |
Redmond, March 2019 | Seattle Area AI Meetup |
Amsterdam, March 2019 | LearnAI Airlift for 80+ attendees with 198/200 NSAT. Taught Vision API, Cognitive Services, Custom Vision, Cosmos DB, Bots, and Azure Search |
Sydney, February 2019 | LearnAI Airlift for 18 attendees with 200/200 NSAT |
Singapore, February 2019 | LearnAI Airlift for 50+ attendees with 200/200 NSAT |
Seattle, February 2019 | Cognitive Search session at Microsoft Ready for 100+ attendees |
Redmond, January 2019 | New format that mixed pre-recorded content and live broadcast. You can see a small part of one video here |
Redmond, November 2018 | Cognitive Search session at MLADS, Microsoft's Machine Learning and Data Science Conference |
Las Vegas, August 2018 | My Microsoft Ready session, 190 attendees |
Redmond, July 2018 | Intelligent Bots session (8 hours) for Microsoft GBBs |
Redmond, June 2018 | KMB was my first of many commits to Azure GitHub repos |
Redmond, June 2018 | KMB launched in private preview, with a long list of innovations such as collaboration, pricing, a bot interface, a Content Moderator custom skill, and alternative agendas |
Redmond, May 2018 | Knowledge Mining session for Microsoft Digital Ready |
Redmond, May 2018 | Webinar broadcast live to 1,100+ attendees, with 185/200 NSAT. Now published on the Azure YouTube channel. Part 1: https://youtu.be/k5xScEyyI4M Part 2: https://youtu.be/Cf6UQSoL5mk Part 3: https://youtu.be/DM8LxXyiihg |
Redmond, March 2018 | Channel 9 videos with Anna Thomas. You can see part 1 here and part 2 here |
Lisbon, February 2018 | Cognitive Services Bootcamp: Cognitive Services, Cosmos DB, Azure Search, and Bots |
London, February 2018 | Delivered the Cognitive Services Bootcamp: Cognitive Services, Cosmos DB, Azure Search, and Bots |
Bellevue, February 2018 | Intelligent Cloud Bootcamp session, with Buck Woody, about AI and DevOps |
Some cool projects from my Azure CSA role, from November 2015 until November 2017. During this period, I worked with some of the largest companies in Brazil, helping their journeys to the cloud. One of them increased cloud consumption by dozens of times, becoming a top-3 Azure customer in Brazil. IoT, AI, and data lake projects helped drive this adoption growth.
The client was installing the biggest wind farms in the country and was looking for real-time monitoring. Again I suggested the idea of cross-validating the data with other data sources, and we did a POC. It went very well, and a vendor was hired to implement the solution. SAP data ingestion and weather forecasts were key differentiators, as well as the real-time Power BI dashboards.
The client was a giant company with industrial facilities distributed throughout the country. There were several parallel requirements: the CIO needed to archive data from the CRM appliance, the COO wanted to cross IoT data with transactional systems, and the CMO wanted to understand customer behavior.
I merged it all into one data lake project and did the first POC in the first week. It went very well, and we started to add data sources and advanced analytics with machine learning.
The competitors didn't stand a chance, and after 2 months I was presenting the solution and the results to the CIO. When I showed my findings about industrial vehicles, he immediately connected with the COO, who confirmed the problems and was impressed with the ability to find insights in the data. I found problems they were not even aware of, creating great momentum for the project.
The final data architecture is in the image below.
Certificates
PSafe had reached 30 million MAU, all generating a gigantic amount of logs, something like 1 TB a day. Hadoop was in use, but it was hard to understand and use the data. I had worked there before and returned to the company as the leader of the data area, to reorganize the data infrastructure. We created a data lake to make sure we were storing and analyzing the correct data.
The project used Cloudera, including Hive/Pig for batch processing and Impala for interactive queries. The project also defined the data structure within HDFS: raw files, raw data, business data, and BI data. Check the last image, a table, for more details. The images below are missing Impala and the Parquet files, which we added in a second phase.
Today, July 2019, this structure is still in use, practically unchanged.
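The layered HDFS structure above (raw files, raw data, business data, BI data) can be captured as a simple path convention so every batch job knows where to read and write. This is a minimal illustrative sketch, not the project's actual code; the zone names and path layout are assumptions for demonstration.

```python
# Hypothetical sketch of a zoned data lake path convention, inspired by the
# raw files -> raw data -> business data -> BI data layering described above.

ZONES = ["raw-files", "raw-data", "business-data", "bi-data"]

def zone_path(zone: str, source: str, dt: str) -> str:
    """Build the HDFS path for a dataset in a given zone, partitioned by day."""
    if zone not in ZONES:
        raise ValueError(f"unknown zone: {zone}")
    return f"/datalake/{zone}/{source}/dt={dt}"

# A pipeline promotes data zone by zone: land raw logs, parse them,
# apply business rules, then aggregate for BI dashboards.
print(zone_path("raw-files", "app-logs", "2019-07-01"))
# /datalake/raw-files/app-logs/dt=2019-07-01
```

Encoding the zones in paths like this keeps ingestion, curation, and BI jobs decoupled: each job only needs to know its input zone and its output zone.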
Data Architecture 1 | Data Architecture 2 | Data Architecture 3 |
---|---|---|
I am very proud of this project. Using SQL Server 2008, the Microsoft Fast Track reference architecture, and very simple hardware, I created an MPP data warehouse for Lemon Bank, where I had been working since 2002. I had to find a way to scale out, since the data volume was bigger than the total disk space of the biggest server available. A real big data problem, solved with a lot of study and creativity.
Today, February 2020, this structure is still in use, practically unchanged.
Interesting points of this project:
- ETL was SQL scripts + bcp (DTS was not fast enough)
- A view was the central fact table. It was a UNION of hundreds of tables, each in a different database. Each database was on a different disk, and sometimes on a different server (linked server). All records had an ID. Check constraints helped the optimizer avoid unnecessary table access.
- Self-service BI on top of this, with Excel and also a dynamic query interface.
- This project won The Brazilian Bank Industry Award (CIAB) in 2008.
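The union-view fact table above can be sketched with a few lines of code. This is a minimal illustration using Python's sqlite3 (the original project was SQL Server); table and column names are made up. In SQL Server, CHECK constraints on each table's ID range are what let the optimizer skip tables that cannot match a filter (the partitioned-view pattern); SQLite shows only the structure, not the pruning.

```python
# Sketch of a "partitioned view": many physical fact tables, each holding a
# disjoint ID range, unified behind one UNION ALL view. Hypothetical schema.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Two physical fact tables with disjoint ID ranges, enforced by CHECK
# constraints (in SQL Server these enable table elimination at query time).
cur.execute("CREATE TABLE fact_2008_01 (id INTEGER CHECK (id BETWEEN 1 AND 1000), amount REAL)")
cur.execute("CREATE TABLE fact_2008_02 (id INTEGER CHECK (id BETWEEN 1001 AND 2000), amount REAL)")
cur.execute("INSERT INTO fact_2008_01 VALUES (10, 50.0), (20, 30.0)")
cur.execute("INSERT INTO fact_2008_02 VALUES (1500, 99.0)")

# The central "fact table" is just a view over all the physical tables,
# which can live in different databases, disks, or linked servers.
cur.execute("""
CREATE VIEW fact_sales AS
  SELECT id, amount FROM fact_2008_01
  UNION ALL
  SELECT id, amount FROM fact_2008_02
""")

# Queries hit one logical table even though storage is spread out.
total = cur.execute("SELECT SUM(amount) FROM fact_sales").fetchone()[0]
print(total)  # 179.0
```

The design choice is the interesting part: when one box cannot hold the data, hundreds of small range-partitioned tables plus a view give you scale-out storage while BI tools still see a single fact table.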
MPP Database | ETL | Trophy |
---|---|---|
My "handmade" distributed database | ETL Architecture | The project award |
I was contracted by EDS to help a bank with performance problems in its transactional database. All the experts before me had suggested hardware upgrades, which did not fix the problem. How could I succeed where HP, IBM, and Accenture consultants had failed? I had to do something different, and to me it was clear that the problem was in the application.
What did I do differently? I asked the client to see the code! After some resistance, I was allowed to audit the application, and I found multiple problems. We fixed them one by one, and the deployment day was 9/11/2011. For many reasons, an unforgettable day.
Last winter I created a LinkedIn article series about my lessons learned as a DBA. It was called DBA TALES FROM THE CRYPT, and this project was detailed here.
In 2007 I was really frustrated with the lack of organization within the IT department. As a DBA I was affected because I could not organize my work, even though I was managing the data area. That year, a friend returned from France, where he had done a PhD and learned about Scrum. Back in Brazil, he acquired remarkable knowledge and experience in agile methods, including Kanban. He ended up opening a training company and invited me to a presentation. I immediately saw the value of all that and began to study the subject as well.
It is not common to see the DBA introducing a project management methodology in a company. But that's exactly what happened!
The DBA team started to use Kanban in 2007, which helped not only our own work but also let us measure the impact of other teams' lack of methodology on us. I'm also a person with lots of connections, and I helped my friend sell one of his first in-company agile trainings. The client was another friend's software company, which had never heard about Agile before. Today, in 2019, my Agile expert friend still has his consulting company, with dozens of employees and hundreds of clients. And my friend's software company still uses agile methods for all of its operations.
In 2008 I was working for this fintech, and the whole company was challenged to create a product to increase revenue. No one expected someone from IT to give a suggestion to the business area, but I'm very creative, and I won! I suggested a remuneration model for the employees of the affiliated networks. The image below is a screenshot of the bank's intranet, with my photo illustrating the winning idea. The project was implemented and worked as expected.