CODE 3-2022 Web
MAY/JUN 2022
codemag.com - THE LEADING INDEPENDENT DEVELOPER MAGAZINE - US $8.95 Can $11.95
Document Databases with Marten
Features

8 Blockchain
Sahil clarifies a few points about blockchain, bitcoin, and the future.
Sahil Malik

14 Create Maintainable Minimal Web APIs
Paul shows you the benefits of a Router class and minimizing your Web API calls.
Paul D. Sheriff

22 Change-Tracking Mixed-State Graphs in EF Core
Julie says that integration testing is the key to tracking changes in EF Core. Learn what you need to know to get up and running with it.
Julie Lerman

50 Implementing Face Recognition Using Deep Learning and Support Vector Machines
If you have a fancy new computer or phone, you might already be using facial recognition. Wei-Meng explains how this exciting technology is at once simpler than you think and crazy complicated—and super cool!
Wei-Meng Lee

66 Distributed Caching in ASP.NET 6 Using Redis in Azure
Whether you realize it or not, you’ve been benefiting from caching for years. Joydip tells you how and why to use caching to your best advantage.
Joydip Kanjilal

Departments

Projects
Cosmos is a document storage tool for database developers. You’ll want to consider what Shawn reveals about this spiffy tool before you build your next app.
Shawn Wildermuth
US subscriptions are US $29.99 for one year. Subscriptions outside the US pay $50.99 USD. Payments should be made in US dollars drawn on a US bank. American Express,
MasterCard, Visa, and Discover credit cards are accepted. Bill Me option is available only for US subscriptions. Back issues are available. For subscription information,
send e-mail to subscriptions@codemag.com or contact Customer Service at 832-717-4445 ext. 9.
Subscribe online at www.codemag.com
CODE Component Developer Magazine (ISSN # 1547-5166) is published bimonthly by EPS Software Corporation, 6605 Cypresswood Drive, Suite 425, Spring, TX 77379 U.S.A.
POSTMASTER: Send address changes to CODE Component Developer Magazine, 6605 Cypresswood Drive, Suite 425, Spring, TX 77379 U.S.A.
Artistic Collaboration
In my last editorial “The Computer is my Paintbrush,” I talked about the thrill I still get building applications
after 30+ years in this business. There are two parts of this process that bring me joy. The first part is the
process of creation. Taking an idea from thought to code to a working application is simply amazing
to me. The second thrilling part of this process is observing the use of these applications. Sometimes I’m the user, and in many more cases, others receive the benefit from my applications.

That editorial was written from a singular point of view: MY point of view. What I mean by this is that I discussed writing applications as a SOLO developer. What I failed to mention is that there’s another style of work that can bring joy, and that style of work involves working with others to build great things. What’s this other style of work? Collaboration! Let’s talk about collaboration.

When I started my career, I was a “Lone Wolf” coder. I was generally the solo programmer on staff and responsible for everything code related. It didn’t take me too long to realize that this was a limiting factor in my progression as a developer. I soon began to seek out other developers. This was the era of CompuServe, so I took to some of the developer forums there to see how other developers worked. This definitely scratched an itch but was not totally fulfilling.

It wasn’t until I got a job working for a company called The Juiceman that I learned just how valuable collaboration was. When I started at The Juiceman, I collaborated with some of the best developers I’d ever crossed paths with. Two of them, Eric Ranft and Mark Davis, are still friends to this day and have gone on to do great things (check out John Petersen’s column to see what Eric went on to do).

After my time at The Juiceman, I moved to a company called Pinnacle Publishing. It was at Pinnacle where I established a friendship with Erik Ruthruff. Erik and I have collaborated on numerous projects for nearly three decades now. These projects include courseware development, building tools for managing conferences, and working on this magazine. I’m forever grateful for our collaboration and friendship. It’s amazing how much a single person can affect your life.

Soon after leaving Pinnacle, I returned to my life as a Lone Wolf, this time as a consultant. This is a bit of a misnomer. There isn’t really such a thing as a Lone Wolf consultant because consultants, by definition, work with others. During my years as a consultant, I had the opportunity to meet and work with lots of other developers. It was also during this time that I got the conference speaker bug. It was at a conference that I met another long-term collaborator and now best friend, John Petersen. John and I met after I gave a session and we hit it off right away.

Soon after that conference, I started writing a Visual FoxPro book and was trying to assemble a team of writers, as I didn’t want to write a whole book by myself. I KNEW that collaboration would be the only way to succeed in writing a book. It didn’t take long before I pitched this idea to John to come on board to write a chapter or two. Long story short, John became a full-fledged contributor to the book and we’ve been working together in one capacity or another ever since.

I should also mention that during my book-writing phase, I met Melanie Spiller (my kick-butt editor on this magazine) via another publisher. That collaboration didn’t work out at the time (it was me, not Melanie, for the record), but it has definitely worked well over the last 10 years for sure. So sometimes collaboration takes time to work.

Finally, I must mention one of the most intimate collaborators I’ve worked with for the last 15+ years: Greg Lawrence. Greg and I became friends when he worked at an ISP doing network stuff and a bit of HTML/JavaScript. It was this work that gave me the inkling that Greg might make a good developer. I took a chance on him and hired him as my first employee. My inkling was correct, and Greg did become a great developer, with whom I’ve been lucky enough to work on some killer applications.

I want to point out one thing that is valuable in this particular collaboration: the learning. Not Greg’s learning but mine. Over many years, Greg has taught me a lot when it comes to teaching development skills, as well as how to build software. His skills as a developer have helped make my skills better. The apprentice is now the master, as they say in the Star Wars universe.

As many of you already know, I am a huge fan of the movie business and, in particular, the process of movie making. The thing that I have learned from that industry is just how collaborative it really is. No good movie gets made without collaboration, and I feel the same about software development.

Collaboration can deliver real results, and sometimes the best collaborations come from just a single sentence or comment. For nearly two decades, I served in the role of lead architect/engineer for a midwest credit card company. During this time, I provided many of the tools we built our applications with, including frameworks, libraries, tools, and documentation. I recall one day when Dan Zerfas, the VP of development (and still my friend), suggested that I wrap all these disparate tools into a common shell. When he spoke those words, I recognized a flash of brilliance. I went to work building a tool called DPSI Shell. DPSI is the acronym of my company name. This tool is still in use today—over 15 years. All this from one statement from a valued collaborator. The power of collaboration has never been more evident than when I recall this story.

If I can leave you with one thought: Keep your ears, heart, and mind open to collaboration. You never know where good ideas may come from.

Rod Paddock
6 Editorial codemag.com
ONLINE QUICK ID 2205021
Blockchain
When I took up IT as a career, the most appealing thing to me was that I would never get bored in this field. I mean, which other
field has fun names, like “bugs” for defects? And every couple of years, everything changes. After many years in this field, this
still holds true. I keep seeing groundbreaking changes that promise to change our society. Think about the change the Internet
has had on our lives. Think about the change storage capacity has had on our lives. Now that so much can be recorded and searched, computers know more about us than we know about us. Scary. Think about the amount of change that phone in your pocket has brought forth. Think about how different the COVID crisis would have been if we couldn’t work from home, or didn’t have information at our fingertips. Or how wars are fought and broadcast on social media.

One such technology that has flown under the radar for many years, and has become very important recently, is blockchain. I thought we should chat about it.

Sahil Malik (www.winsmarts.com, @sahilmalik) is a Microsoft MVP, INETA speaker, a .NET author, consultant, and trainer. Sahil loves interacting with fellow geeks in real time. His talks and trainings are full of humor and practical nuggets. His areas of expertise are cross-platform mobile app development, Microsoft anything, and security and identity.

What Is Blockchain
Believe it or not, this concept, although all the rage today, was first discussed all the way back in 1982. In a dissertation entitled “Computer Systems Established, Maintained, and Trusted by Mutually Suspicious Groups,” David Chaum first proposed the idea of a blockchain-like protocol.

The easiest way to think of blockchain is as a distributed database that’s shared among the nodes of a computer network. Like any database, it stores information in a digital format. The only difference is that this database doesn’t reside on one computer, or even a computer in Azure or AWS; this database is distributed among all the nodes of the computer network that participate in that blockchain. These computers can be your laptop, a server somewhere—really anything.

You’ve probably heard of Bitcoin. Bitcoin is an example of a decentralized digital currency. And the Bitcoin system is a collection of nodes that run Bitcoin code and store data on its blockchain. If you and I exchange some Bitcoin, this transaction gets written to this distributed database. And because it’s decentralized, there’s no single party controlling it. This means, for all practical purposes, that given its size, it cannot be censored, tampered with, or taken over by a single party. This is quite in contrast with, say, a database sitting in the cloud. Some hacker could tamper with that database, or a central authority could censor it. Or some disgruntled employee or party could take control of it and shut it down or tamper with it. Bitcoin chooses to use blockchain to safeguard its decentralized currency, but there are many other uses of blockchain, and many uses that we’re still figuring out.

What would blockchain mean for proving your identity? This is exactly the spirit behind decentralized identities in the Microsoft Identity Platform. Imagine the ramifications where you, the consumer, hold your identity and choose to share it with whomever you desire.

What would it mean for social media? Information could be shared in a P2P form, and, for better or worse, would be effectively uncensorable.

This sure makes a lot of governments and people in positions of power quite nervous. Like I said, just as the Internet has had a profound impact on our society, so will blockchain.

Why the name “blockchain?” A typical database has data in rows and columns. In-memory databases can store objects, similar to JSON. But blockchain collects information together in groups called blocks. These blocks have a defined storage capacity, and when filled, are closed, given a timestamp, and linked to a previously filled block. This creates a chain of blocks, and therefore we call it a blockchain. This is illustrated in Figure 1.

This also contributes to the irreversible nature of blockchain data. These chains are replicated across various nodes, and changing the history on all these nodes is practically impossible.

Think of blockchain as an immutable ledger that cannot be edited, deleted, or destroyed. An alternate name for blockchain is distributed ledger technology or DLT.
How Does Blockchain Work?
Blockchain is designed for digital information to be recorded and distributed, but not edited. When a new transaction is entered on the blockchain, the transaction is transmitted to a peer-to-peer network of computers. This network of computers then solves equations to confirm validity of the transaction. When the transaction is confirmed as a valid transaction, it’s entered into blocks. Each block contains data, the hash of the block, and the hash of the previous block it’s linked to. As blocks fill, they’re chained together, creating a long history of transactions that are permanent.

The details of the data depend on the kind of blockchain. For instance, a payment network, such as Bitcoin, could store the sender, receiver, and the amount of the transaction.

The hash acts as a fingerprint for your block. It verifies the uniqueness and validity of the block. Changing the block causes the hash to change, which means that changing the block data effectively removes it from the chain, making spurious data easy to identify.

Similarly, the hash of the previous block creates a chain of blocks. This means that, like a journaling database or a ledger system, you can’t simply edit one link in the chain and expect the chain to not detect the problem.

The first block in a blockchain doesn’t point to any previous block and it’s called the genesis block.

Now, you may be thinking that new computers, such as that fancy M1 Ultra Mac, are pretty fast. And Bitcoin is about 324GB of data. So why can’t I just completely recalculate the entire blockchain’s hashes and make my invalid blockchain valid? You could. But there’s an additional layer of protection called “proof of work.” This effectively slows down the creation of a new block. For instance, in Bitcoin, it takes 10 minutes to calculate proof of work for each block and add it to your blockchain. This means that editing the entire Bitcoin blockchain is impractical. To put numbers to it, Bitcoin block size is around 1MB and the Bitcoin blockchain is 324GB, which translates to more than six years to tamper with the entire Bitcoin blockchain at its current size.

Yet another mechanism to keep things secure is the fact that a blockchain is distributed. This means that anyone who joins the blockchain gets a full copy of the entire blockchain. So, if someone creates a new block, that block is sent to everyone on the network, and everyone verifies it as a legit block belonging to the existing blockchain. This is then added to the existing blockchain on everyone’s network, thereby creating a consensus among everyone.

To tamper with a blockchain, you’d have to recalculate the entire blockchain, get over the proof of work hurdle, and tamper with this blockchain on at least 50% of the P2P network. As you can imagine, this is nearly impossible to do on a mature network such as Bitcoin. Again, to give it some numbers, the Bitcoin network has close to 15,000 nodes, but you can see an up-to-date number at https://bitnodes.io/.

Anywhere you need a record that can be trusted is a great application for it. Here are some examples of blockchain being used today.

Cryptocurrencies
Imagine that you’re in a group of four friends and you frequently exchange money. Maybe you go out for dinner and one of you foots the bill, for instance. The other three owe that person some money. How does this work?

Well, you maintain a record, or a ledger, of who owes who how much. Then, there is mutual trust. You have trust, and records, and this is how you can continue to loan each other money.

You already know that blockchain can give you an untamperable ledger database. But really, anyone can write anything to it. What prevents me from writing, randomly, that Alice owes you money? This is where trust comes in. For that, you add cryptography to it and you get trust. This is the foundation of cryptocurrencies. Anyone can write to the ledger around who owes who how much, but it isn’t until the data is signed with a private key that the transaction is trustable. And the public key allows anyone to verify that the transaction is legit. The signature, in fact, requires your private key, the entry ID, and the contents of the message, so the signature is distinct for each transaction.

There are many cryptocurrencies, such as Bitcoin, Ethereum, Ripple, Litecoin, and more. In fact, there are more than 1,600 cryptocurrencies, each with their unique characteristics. They’re all implementations of blockchain.

On top of that, this virtual currency is tied to a real currency to start with. So, let’s say I buy $100 worth of SahilCoin. I know this sounds funny, but I could set up a new coin by that name and we could start using it between friends <wink wink>. Now, let’s say that I can’t transact more than $100 worth of SahilCoin. This means that I can’t keep generating signed transactions on the ledger once I exceed my allotted value of $100. Within that limit, I can continue to settle transactions in SahilCoin. I don’t need banks or wire transfer fees or intermediaries. Even so, the money has real value. In fact, currencies such as Bitcoin can be converted back and forth to real cash, with the convenience of an ATM. Yes, there are physical ATMs where you can buy and sell Bitcoin for cash. Of course, you can do so virtually as well.

This is where things get interesting and what differentiates one coin from another. When an entry is made to the blockchain, theoretically speaking, everyone should be able to verify and agree to that entry. But in reality, there are transmission delays on a P2P network. Different cryptocurrencies use different protocols to balance between speed and reliability. Additionally, cryptocurrencies such as Ethereum allow the deployment of smart contracts and decentralized applications, which are bits of code that can execute and release cryptocurrency when certain conditions are met. Both Bitcoin and Ethereum currently use proof of work as their consensus protocol, but Ethereum will soon move to proof of stake, which will allow it to be much more scalable, secure, and much more energy efficient.
Proof of work comes at a steep price, though: The Bitcoin network consumes an enormous amount of electricity—enough to power Ukraine and Egypt. Much of this energy is created from non-renewable sources, which produces a terrible amount of greenhouse gases. On top of that, the massive computation requirements mean that you need to have powerful hardware, which gets outdated very quickly, so it produces tons of e-waste. In fact, Tesla started allowing customers to buy cars using Bitcoin, and then reversed their stance due to environmental concerns.

Even though the name says “coin,” there’s no physical coin involved here. It’s not backed or issued by central banks or governments; it’s backed by blockchain, and an equivalent cash transaction value that fluctuates based on demand. The currency amount on Bitcoin tokens is kept using public and private keys. The public key is sort of your bank account number, and can serve as an address that you use to send or receive Bitcoin. The private key is something you must never share. It’s what you use to authorize Bitcoin transmissions. A Bitcoin wallet, on the other hand, is a secure digital store of information that keeps track of the ownership of your coins.

Smart Contracts
Smart contracts are tiny computer programs that live on a blockchain and can disburse a payment once the conditions of the contract are met. For instance, if there’s a crowdfunding effort involved, money can be collected from many people and be disbursed when minimums are met. Or you can have a collective auction. Or perhaps an inheritance. Think of it as a digital vending machine.

Smart contracts on the Ethereum blockchain require you to write the contract in “smart contract language,” which has its own very simple syntax, and you need to have enough ETH to deploy your contract. Then you have to pay “Gas,” which is a unit of computational effort to execute operations on the Ethereum network, to deploy your smart contract. You pay these gas fees also in ETH.

So as long as you can back your smart contract with ETH and gas money, and write the logic in a simple language and stick it on the Ethereum blockchain, you’re in business.

Just read that sentence again. Now you see why I picked this field for work?

NFTs
For such a long time, we’ve been used to physical objects that are hard to replicate. There is the Mona Lisa, for instance. I’m sure copies exist, but there’s only one original, and there’s a way to protect it and detect counterfeits. This is so much harder in the digital world.

Being digital creates so many issues. It serves as a disincentive to produce original work, because people just copy it. Just think of all the memes floating around. There’s no payment mechanism to the original creator as their creation goes viral. And most of all, the recipient has a hard time trusting what they see and whether they can trust it to be original. For instance, a doctored video of the president saying silly stuff on social media could have serious ramifications. Wouldn’t it be nice if the White House could somehow stamp something as original and untampered with, and, just like you verify a site using SSL, you could verify the originality of content? This is exactly the problem NFTs solve.

NFTs stand for non-fungible tokens. In our digital world, people copy everything. All it takes is a bunch of keystrokes, and your digital artwork is now mine with a screenshot. An NFT is simply any binary data put on the blockchain. Given the characteristics of a blockchain—the fact that it’s immutable and independently verifiable—now you can prove the originality of any work. This has pretty significant ramifications for ownership rights.

NFTs don’t have to be just images. They can be anything. They can be, for instance, an audio file, a video, or any other kind of digital artwork. They can be domain names, concert tickets, objects in the metaverse—really, anything that is digital or can have a digital representation. Do you remember a game called FarmVille, where you could buy/sell stuff? McDonald’s just bought some real estate in the metaverse. Strange times we live in.

You can even buy real estate—yes, a house you can live in—using NFTs. Remember, you’re proving ownership. What’s your house’s current ownership? Who has the deed? Well, how is that ownership proof not digital? Of course, there are still issues to be worked out, such as governments recognizing Bitcoin as official currency or NFTs as a valid equivalent of a deed. But you can link Bitcoin to dollars and NFTs to deeds.
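The “digital vending machine” idea from the Smart Contracts section can be sketched in plain JavaScript. To be clear, this is only a conceptual model—real Ethereum contracts are written in a contract language and deployed with ETH and gas—and the `crowdfundingContract` shape here is invented for illustration.

```javascript
// Conceptual smart-contract sketch: pledges are held until the
// contract's condition (a funding goal) is met, then released
// automatically, with no human middleman.
function crowdfundingContract(goal) {
  const pledges = new Map();
  let disbursed = false;

  return {
    pledge(from, amount) {
      pledges.set(from, (pledges.get(from) || 0) + amount);
    },
    total() {
      return [...pledges.values()].reduce((a, b) => a + b, 0);
    },
    // Fires at most once, and only when the minimum is met.
    tryDisburse() {
      if (!disbursed && this.total() >= goal) disbursed = true;
      return disbursed;
    }
  };
}

const contract = crowdfundingContract(100);
contract.pledge("alice", 60);
console.log(contract.tryDisburse()); // false: goal not yet met
contract.pledge("bob", 50);
console.log(contract.tryDisburse()); // true: 110 >= 100, funds release
```

On a real chain, the same condition check runs on every node, so no single party can suppress or fake the disbursement.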
This NFT’s ownership is now sinaEstavi’s, and sinaEstavi can, in the future, choose to sell it to someone else. The transaction to sell the NFT itself can be stored on the blockchain, which proves ownership transfer, again, in a verifiable, non-tamperable manner. Of course, if sinaEstavi can’t find any buyers, this NFT becomes worthless, much like objects in the real world. After all, what’s so special about the Mona Lisa?

You may be thinking that typical blocks in blockchain are small. For instance, Bitcoin block size is just 1MB. In a world where your phone takes 108MP images, how do you efficiently store them on blockchain? The answer is that the entry on the blockchain contains a unique fingerprint (hash), token name and symbol, a pointer to the previous block, and a link to the file on IPFS. IPFS stands for InterPlanetary File System. And it looks like ipfs://somelongweirdstring. Usually when you try to get a file, it looks like https://location. That’s called location-based addressing.

Here in IPFS, you’re using ipfs://hash and providing a hash of the content you’re interested in. This is content-based addressing. This link points you to both the file and the metadata of the file at ipfs://somelongweirdstring/metadata.json. There are Python and Node packages that let you decipher this metadata easily, although you can also just visit https://ipfs.io. When you request an IPFS file, anyone on the blockchain network who has a copy of that file will return the file to you, and given that it’s built on blockchain, you can use the hash to ensure that the file hasn’t been tampered with.

Also, the metadata can point to a location that is a simple HTTPS URL or any other mechanism that stores data off the blockchain. This is called off-chain storage, which allows you to have a hybrid of blockchain and old-style technologies and gain the best of both worlds.
You can even attach a smart contract to pay the owner if it changes hands. Or, for that matter, pay all previous owners, or just the creator, every time it changes hands, much like a royalty.

Here you are, reading this article. Imagine if this article were an NFT, and every reader who read it sent a royalty of 25 satoshi (the smallest unit of Bitcoin) back to the author. One bitcoin has 100 million satoshis, and 25 satoshi roughly translates to one cent. This transaction could be done via a smart contract, with no middlemen, transaction fees, or even considerable delays. But this kind of microtransaction has historically been nearly impossible to replicate using the conventional banking system. Can I write a blogpost where every reader pays me 0.0001 cent? Not in the traditional way. With NFTs or crypto, this is possible. And you do see examples of crypto tip jars already on so many sites. What prevents us from taking the next step and unlocking smart contracts on content? I am sure it will happen, if it isn’t already.

I have no misgivings about the capability and risks of this technology. This transaction could cross borders, avoid the banking system altogether, even circumvent governments. It makes a lot of people in power very nervous. Sure, this has positives and negatives and, for sure, like any technology, has potential for misuse or great use.

This technology, I am sure, is being used to transport illicit money, and governments are trying hard to catch up to it, but there are some really good uses for it too. Imagine someone trying to flee their country because their country is at war. Carrying money as crypto is as simple as memorizing a phrase. Or, for that matter, carrying information. For instance, in 2017, the Turkish government blocked access to Wikipedia as anti-government. People just put Turkish Wikipedia on IPFS, and good luck blocking that. Another example is https://d.tube/, which is just like YouTube, but built on blockchain. This means no ads, no censorship, and you can’t delete videos.

Decentralized Identities
Identity is an interesting problem. Many years ago, it was a username/password. We then started coming up with better mechanisms for proving that you had the right password. Over time, we realized that it was not only hard to keep this secure, but it was also inflexible. So we created standards like OpenID Connect, where we delegate or federate the process of proving an identity to a well-known identity provider, like Twitter, Facebook, Google, Microsoft, etc.

It’s still not ideal. Why should Facebook or Google or Microsoft hold all my data? Remember, an identity is tied to your profile. Let’s say you go to a doctor and have to prove who you are. Today, you show a driver’s license or similar form of identification. I’ve always feared what would happen if I lost my identification when travelling and TSA wouldn’t let me on the plane. Could I log into my Azure AD account to prove who I am? It’s certainly enough for my employer, so why isn’t it enough for TSA or my doctor?

Well, it may be enough, but that requires both of you trusting Microsoft to hold your profile, and the government and its regulations allowing Microsoft to hold every citizen’s personal information.

Yeah, not gonna happen.

This is where decentralized identities come in. Put simply, they are identity information that you control, backed by an attestation authority. And you control what bit of information you share with whom. For instance, when visiting a doctor, they don’t need to know your salary. Or while going through TSA, they don’t need to know about that wart that itches weird. This is where you can hold your distributed identity in a wallet, and you share what you deem worth sharing, yet no central authority holds all your profile information. You do.

Other uses
The blockchain technology is nascent, but a number of future uses and implementations will occur. How about voting? What’s more important than being able to trust our voting system? How about notaries? How about tracking goods and shippable items that aren’t tied to a single vendor’s tracking system? Perhaps medical records that you, the patient, control and share as and when necessary, in whatever manner necessary, with your doctor? Vaccination records, taxation records, so much more. Like many other technologies, the technology runs far ahead and then the regulators catch up. We’re entering an interesting time when governments are launching cryptocurrency equivalents of their fiat currencies. What does this mean for the traditional banking system?

Ethereum Blockchain in JavaScript
Sometimes it’s important to understand the mechanics and reasoning behind any new technology before we roll up our sleeves and see it in action. The good news is that, like almost anything else, a lot of the hard work has been done for you in reusable libraries.

Listing 1: Our starter page with web3.js

<!doctype html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Ethereum demo</title>
  <script
    src="https://cdn.jsdelivr.net/npm/web3@latest/dist/web3.min.js">
  </script>
</head>
<body>
</body>
</html>

Listing 2: Logic to process ethereum blocks

// Assumes the web3 instance from Listing 1, plus these two globals
// (initial values shown here are illustrative):
let latestBlockNumber = -1;
const timeInterval = 5000;

async function checkCurrentBlock() {
  const currentBlockNumber =
    await web3.eth.getBlockNumber();
  console.log(
    "Current blockchain top: " + currentBlockNumber,
    " latest Block Number: " + latestBlockNumber)
  while (latestBlockNumber == -1 ||
      currentBlockNumber > latestBlockNumber) {
    await processBlock(
      latestBlockNumber == -1 ?
        currentBlockNumber : latestBlockNumber + 1);
  }
  setTimeout(checkCurrentBlock, timeInterval);
}

async function processBlock(blockNumber) {
  console.log("Processing block: " + blockNumber)
  latestBlockNumber = blockNumber;
}
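Listing 2’s catch-up loop can be exercised without a live Ethereum node by stubbing out the web3 client. The stub below (`web3stub`, with a made-up block number) exists only so the control flow can be run and inspected offline; a synchronous sketch is used in place of Listing 2’s async calls.

```javascript
// Offline sketch of Listing 2's catch-up loop. The web3 client is
// replaced with a stub returning a fixed, invented chain tip.
const web3stub = { getBlockNumber: () => 14500002 };

let latestBlockNumber = -1;  // -1 means nothing processed yet
const processed = [];

function processBlock(n) {
  processed.push(n);
  latestBlockNumber = n;
}

// Process from the last seen block up to the current chain tip.
function catchUp() {
  const current = web3stub.getBlockNumber();
  while (latestBlockNumber === -1 || current > latestBlockNumber) {
    processBlock(latestBlockNumber === -1
      ? current                 // first run: jump straight to the tip
      : latestBlockNumber + 1); // otherwise: fill in missed blocks
  }
}

catchUp();
console.log(processed); // [ 14500002 ]: first run processes only the tip
```

On subsequent timer ticks, any blocks mined since the last pass would be processed one by one, which is exactly what Listing 2’s `while` loop does against a real node.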
ONLINE QUICK ID 2205031
Paul D. Sheriff
www.pdsa.com
psheriff@pdsa.com

Paul has been in the IT industry over 35 years. In that time, he has successfully assisted hundreds of companies architect software applications to solve their toughest business problems. Paul has been a teacher and mentor through various mediums such as video courses, blogs, articles, and speaking engagements at user groups and conferences around the world. Paul has many courses in the www.pluralsight.com library (http://www.pluralsight.com/author/paul-sheriff) on topics including .NET 6, LINQ, JavaScript, Angular, MVC, WPF, ADO.NET, jQuery, and Bootstrap. Contact Paul at psheriff@pdsa.com.

If you have one set of Web APIs for working with products and another set for working with customers, create a ProductRouter class and a CustomerRouter class. In this article, you're going to see how to move each group of Web APIs into their own router class to provide a much more consistent and maintainable way to create Minimal Web API calls.

A Sample Minimal Web API
Let's look at a simple Minimal Web API system that works with product data. You normally have a Product class with basic properties such as ProductID, Name, Color, and ListPrice, as shown in the following code snippet.

public partial class Product {
  public int ProductID { get; set; }
  public string Name { get; set; }
  public string Color { get; set; }
  public decimal ListPrice { get; set; }
}

In the Program.cs file, you write an app.MapGet() method to return a set of Product objects. In this example, I'm using hard-coded Product objects, whereas in a real application, you'd most likely use the Entity Framework to retrieve these from a database table.

// Get a collection of data
app.MapGet("/product", () => {
  return Results.Ok(new List<Product> {
    new Product {
      ProductID = 706,
      Name = "HL Road Frame - Red, 58",
      Color = "Red", ListPrice = 1500.00m
    },
    new Product {
      ProductID = 707,
      Name = "Sport-100 Helmet, Red",
      Color = "Red", ListPrice = 34.99m
    }
  });
});

You then have the rest of your app.Map*() methods that retrieve a single product, post a new product, update an existing product, and delete a product, as shown in Listing 1. For a great primer on Minimal APIs, check out Shawn Wildermuth's article entitled "Minimal APIs in .NET 6" in the last issue of CODE Magazine at https://codemag.com/Article/2201081/Minimal-APIs-in-.NET-6.

This is a lot of code in the Program.cs file for just working with products. You can imagine how that code grows as you add the same CRUD logic for customers, employees, books, or whatever other tables you have in your database. Let's now look at how to make this code more manageable.

Create a Web API Project
To get the most out of this article, I suggest that you follow along with the steps as I outline them. You need to install .NET 6 on your computer, which you can get at https://dotnet.microsoft.com/en-us/download/dotnet/6.0. You also need either VS Code (https://code.visualstudio.com/download) or Visual Studio (https://visualstudio.microsoft.com/downloads). If you wish to use VS Code for your editor and your application creation, use the following section for guidance. If you wish to use Visual Studio 2022, skip to the next section for guidance.

Using VS Code
Open VS Code in the top-level folder where you normally create your projects (for example D:\MyVSProjects). Select Terminal > New Terminal from the menu. In the terminal window in VS Code, create a .NET 6 Web API app using the following dotnet command:

dotnet new webapi -minimal -n AdvWorksAPI

Select File > Open Folder… from the menu, navigate to the new AdvWorksAPI folder created by the above command and click the Select Folder button.

Add Required Assets
At the bottom right-hand corner of Visual Studio Code, you should see a warning bar appear (Figure 1). This tells you that you need to add some required assets. Click the Yes button.

This warning box can take a minute to appear; either be patient and wait for it, or you can run a build task by selecting Terminal > Run Build Task… > build from the menu bar.

Save the Workspace
Click File > Save Workspace As… and give it the name AdvWorksAPI. Click the Save button to store this new workspace file on disk.

Using Visual Studio 2022
Open Visual Studio 2022, select Create a new project, and choose the ASP.NET Core Web API template. Click the Next button, name the project AdvWorksAPI, select the folder where you generally create your projects and click the Next button. From the Framework dropdown list (Figure 2) choose .NET 6.0 (Long-term support). From the Authentication type dropdown list choose None. Uncheck the Use controllers (uncheck to use minimal APIs) field and click the Create button.
Try It Out
Whether you've used VS Code or Visual Studio 2022, press F5 to build the Web API project and launch a browser. If you get a dialog box that asks if you should trust the IIS Express certificate, select Yes. In the Security Warning dialog that appears next, select Yes. If you get an error related to privacy and/or HTTPS, open the \Properties\launchSettings.json file and remove the "https://..." from the applicationUrl property.
Create Router Base Class
Any time you're going to create a set of classes that perform the same basic functionality, it's a great idea to create either an interface or a base class to identify those methods and properties that should be in each class. For each of the router classes, you should have a public property called UrlFragment that identifies the first part of the path to your API. In the example shown in Listing 1, the value /product was repeated many times in each of the app.Map*() methods. This is the value that you're going to put into the UrlFragment property. If you have another router class, Customer for example, you place the value /customer into this UrlFragment property.

At some point, you might wish to log messages or errors as your Web API methods are called. Include a protected property named Logger of the type ILogger in this base class. The property is to be injected into either the constructor of your router classes or injected into just those methods that need it.

A single public method, AddRoutes(), is needed in order to initialize the routes for each router class. This method is called from the Program.cs file to initialize the routes you previously created in the Program.cs file. You're going to see the use of this method as you work your way through this article.

Right-click on the AdvWorksAPI project and add a new folder named Components. Add a new file named RouterBase.cs and add the code shown in the following code snippet.

#nullable disable

namespace AdvWorksAPI {
  public class RouterBase {
    public string UrlFragment;
    protected ILogger Logger;

    public virtual void AddRoutes(WebApplication app) {
      // Overridden in each router class to map its endpoints
    }
  }
}

If you're wondering what the directive #nullable disable is at the top of the file, .NET 6 requires all empty strings to either be initialized in the constructor of the class or created as a nullable string. If you don't wish to use this behavior, include this directive at the top of your file.

Next, create a ProductRouter class that inherits from RouterBase:

namespace AdvWorksAPI {
  public class ProductRouter : RouterBase {
    public ProductRouter() {
      UrlFragment = "product";
    }
  }
}

You can see that this ProductRouter class inherits from the RouterBase class. It sets the UrlFragment property to the value "product" because that's going to be used for the endpoint for all your mapping methods. Setting this property once helps you eliminate repeated code and gives you one place to change your route name should you desire.

Figure 3: The Swagger home page allows you to try out your API calls.

Get All Products
Add a protected virtual method to the ProductRouter class named GetAll() to return a collection of Product objects, as shown in Listing 2. I'm using a hard-coded collection of Product objects here just so you can see the Minimal API in action without having to worry about connecting to a database.

Create a Get() Method to Return the IResult
Next, create a method named Get() that returns an IResult object because that's what's expected from a Minimal API. The Get() method uses the Results.Ok() method to return a status code of 200, signifying that the method was successful. The list of Product objects is returned to the calling program, wrapped within this result object.

/// <summary>
/// GET a collection of data
/// </summary>
/// <returns>An IResult object</returns>
protected virtual IResult Get() {
  return Results.Ok(GetAll());
}

Create Method to Add Product Routes
You need to inform the Web API engine that this Get() method is an endpoint. To accomplish this, override the AddRoutes() method in the ProductRouter class as shown in the following code snippet:
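A sketch of what that override might look like (the WebApplication parameter type and the exact Map* calls are assumptions based on the surrounding text, not the article's listing):

```csharp
public class ProductRouter : RouterBase
{
    public ProductRouter()
    {
        UrlFragment = "product";
    }

    public override void AddRoutes(WebApplication app)
    {
        // Build each route from the UrlFragment property so the
        // route name lives in exactly one place.
        app.MapGet($"/{UrlFragment}", () => Get());
        app.MapGet($"/{UrlFragment}/{{id}}", (int id) => Get(id));
    }

    // Get() and Get(id) as defined elsewhere in the article
}
```

In Program.cs, registering the routes then becomes a single call such as new ProductRouter().AddRoutes(app);.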
Try It Out
Run the application and, on the Swagger home page, click
on the Get button for the /product path. Click on the Try it
Out button and click on the Execute button. You should see
the list of products you created in the Get() method appear
as JSON in the Response body section of the Swagger page,
as shown in Figure 4.
Figure 4: The product route now appears on the Swagger page.

Get a Single Product
In addition to retrieving all products, you're most likely going to need to retrieve a single product. Add an overload of the Get() method to the ProductRouter class that accepts a single integer value named id. Use this id variable to search in the Product collection for where the id value matches one of the ProductID property values. The Product object that's located is returned from this method wrapped within the Results.Ok() object. If the id value isn't found, a Results.NotFound() is returned that's reported as a 404 Not Found status code back to the calling program.

/// <summary>
/// GET a single row of data
/// </summary>
/// <returns>An IResult object</returns>
protected virtual IResult Get(int id) {
  // Locate a single row of data
  Product? current = GetAll()
    .Find(p => p.ProductID == id);
  if (current != null) {
    return Results.Ok(current);
  }
  else {
    return Results.NotFound();
  }
}

{
  "productID": 711,
  "name": "A New Product",
  "color": "White",
  "listPrice": 20
}
Figure 5: Swagger allows you to enter an ID to call the Get(id) method.

Update a Product
The public interface for updating an entity through a Web API method is to pass in the ID of the object to update along with the new values for that object.
Try It Out
Run the application and on the Swagger home page, click on
the Get button for the /product path. Click on the Try it Out
button, then click on the Execute button. You should then
see the list of products you created in the Get() method ap-
pear as JSON in the Response body section of the Swagger
page. Look in the Console window and you should see the
message “Getting all products” has appeared.
Change-Tracking Mixed-State
Graphs in EF Core
Julie Lerman
@julielerman
thedatafarm.com/contact

Julie Lerman is a Microsoft Regional Director, Docker Captain, and a long-time Microsoft MVP who now counts her years as a coder in decades. She makes her living as a coach and consultant to software teams around the world. You can find Julie presenting on Entity Framework, Domain-Driven Design, and other topics at user groups and conferences around the world. Julie blogs at thedatafarm.com/blog, is the author of the highly acclaimed "Programming Entity Framework" books, and many popular videos on Pluralsight.com.

Real life relationships can be hard and sometimes, in EF Core, they can be hard as well. EF Core's change tracker has very specific behavior with respect to related data, but it may not always be what you expect. I want to review some of these behaviors so you have a bit of guidance at hand, although I always recommend that you do some integration testing to be sure that your EF Core code does what you're anticipating. There are a number of tools at hand to help you out. In fact, you could discover the behavior without ever calling SaveChanges because the key to the behavior is in the change tracker itself. Whatever SQL it executes for you is simply a manifestation of the knowledge stored in the change tracker. However, I still need a database to perform queries, so I'll use the SQLite provider. Why not InMemory? Because some of its persistence behaviors are different from a real database. For example, the InMemory provider updates key properties for new objects when they're tracked, whereas for many databases, those keys aren't available until after the database generates key values for you.

In a previous CODE Magazine article called Tapping into EF Core's Pipeline (https://www.codemag.com/Article/2103051), one of the taps I wrote about was the ChangeTracker.DebugView introduced in EF Core 5. I'll use that to explore the change tracker as I walk through a number of persistence scenarios with related data.

Starting with a Simple One-to-Many
For this example, I'll adopt the small book publishing house data model from my recently released Pluralsight course, EF Core 6 Fundamentals. This publisher only publishes books written by one author, therefore I have a straightforward one-to-many relationship between author and book. One author can have many books, but a book can only ever have one author. My initial classes are defined in the most common way, where Author has a list of Books and the Book type has both a navigation property back to Author along with an AuthorId foreign key.

How your classes are designed can impact behavior. This sample is an explicit choice for a stake in the ground of what to expect from the change tracker.

public class Author
{
  public int AuthorId { get; set; }
  public string FirstName { get; set; }
  public string LastName { get; set; }
  public List<Book> Books { get; set; } =
    new List<Book>();
}

public class Book
{
  public int BookId { get; set; }
  public string Title { get; set; }
  public Author Author { get; set; }
  public int AuthorId { get; set; }
}

My context is configured to expose DbSets for Author and Book, configure my SQLite database, and seed some author and book data. If you want to try this out, the full code is available in the download for this article and in a repository at github.com/Julielerman/CodeMagEFC6Relationships.

There is so much behavior to explore with this one setup. But it's also interesting to experiment with different combinations of navigation properties and foreign keys. For example, if Book had AuthorId but not an Author navigation property, some behavior would be different. It's also possible to minimize the classes and define relationships in the Fluent API mappings; for example, you could remove the Books property from Author, and the Author and AuthorId properties from Book, and still have a mapped relationship.

And then even more behavior differences are introduced with nullability. For example, Book.AuthorId is a straightforward integer which, by default, is non-nullable. There's nothing here to prevent you from leaving AuthorId's value at 0. However, the default mappings infer the non-nullable AuthorId to mean that, in the database, a Book must have an Author and therefore AuthorId can't be 0. Your code must control that rule to avoid database inconsistencies (and database errors).

My goal here is to show you that there are so many variations to persist data just on this one specific setup, and to leave you with the knowledge and tools to determine what to expect from your own domain and data models.

Persisting When Objects are Tracked
Whether or not the change tracker is already aware of the related data affects how it treats that data. Let's look at a few scenarios where a new book is added to an author's collection of books while these objects are being tracked by an in-scope DbContext.

In this first scenario, I've used an instance of PubContext to retrieve an author from the database with a FirstOrDefault query. The context stays in scope and is tracking the author. I then create a new book and add it to the author's Books list. Then, instead of calling SaveChanges, I'm calling ChangeTracker.DetectChanges to get the change tracker to update its understanding of the entities it's tracking. SaveChanges internally calls DetectChanges, so I'm just using it explicitly and avoiding an unneeded interaction with the database. Then I use ChangeTracker's DebugView.ShortView to get a simple look at what the context thinks about its entities.

void AddBookToExistingAuthorTracked()
{
  using var context = new PubContext();
There's something else interesting to note. The book's AuthorId is now 2. Recall that the book came in without any AuthorId property. Because I passed the entire graph into the context, when I called DetectChanges, EF Core figured out that, because of the relationship, the Book.AuthorId should use the key value from the Author object.

The Add method is a problem because I can't insert that author into the database. That will create an error in the database.

Using Update with the Mixed State Graph
What's my next option? Well, another conclusion might be that the author has changed because they have a new book. This reasoning might lead me to the Update method.

context.Authors.Update(existingAuthor);

This isn't the correct path either. You saw this problem earlier. Explicitly calling Update causes every object in the graph to be marked as Modified (except for any that don't have a key value). Again, the Author would be marked Modified and the Book marked Added, and you'll get a needless command to update all of the Author's properties sent to the database.

On the other hand, if you know that Author was updated, or you're not concerned about the extra database trip or about audit data, Update would be a safe bet.

Focusing on the Graph's Added Object
You learned (above) that without DetectChanges, the Update (and Add and Remove) methods only acknowledge the root of the graph. So what if I pass the book to the context.Add method instead of the author?

var book = existingAuthor.Books[0];
context.Add(book);
//context.ChangeTracker.DetectChanges();

Because the tracker is only aware of the book, it isn't able to read the Author's key property and apply it to Book.AuthorId as you saw it do earlier. The DebugView shows that AuthorId is still 0:

Book {BookId: -2147482647} Added FK {AuthorId: 0}

What if I added the DetectChanges back into the logic? Well, there's a surprise. That doesn't work either! Book.AuthorId is still 0.

The fact that DetectChanges doesn't fix the foreign key also means that calling SaveChanges—which calls DetectChanges—causes the resulting database command to fail because my design is such that a Book must have an Author. An AuthorId value of 0 causes a foreign key constraint error in the database. Notice that the failure is in the database. EF Core won't protect you from making this mistake, which means that, again, you need to ensure that your code enforces your rules.

Did you find it strange that pushing the graph into the context using the author object pulled in the entire graph, but pushing it in using the book object didn't?

All of these details are hard to keep in your head, even if you knew them once or twice before! I always create integration tests to make sure I haven't forgotten a behavior.

Taking More Control over the Graph's State
There's a way to make this pattern work, however: by explicitly setting the foreign key property, because you do have easy access to it. DetectChanges is redundant because the Add method set the state of the Book immediately. Of course, SaveChanges will call that anyway, but again, it's an important behavior to be aware of.

var book = existingAuthor.Books[0];
book.AuthorId = existingAuthor.AuthorId;
context.Add(book);
//context.ChangeTracker.DetectChanges();

One thing I like about just setting the foreign key is that I'm not relying on "magic" to have success with my persistence logic.

Tracking Single Entities with the Entry Method
Here's another place you may be surprised with how EF Core reacts to our incoming graph.

Given that I'm a fan of explicit logic, I'm also a fan of the very explicit DbContext.Entry method. The beauty of the Entry method is that it only pays attention to the root of whatever graph you use as its parameter. It's the only clean way to separate an entity from a graph with EF Core and, because of this strict behavior, you don't have to make guesses about what will happen within a graph.

Yet, it still may surprise you with the mixed state graph. When I use Entry to start tracking the book in my graph:

var book = existingAuthor.Books[0];
context.Entry(book).State = EntityState.Added;

the Entry method ignores the author that's connected to that book. As expected, it sets the state of that book to Added. But because the ChangeTracker is now unaware of the Author object, it can't read existingAuthor.AuthorId to set the book's foreign key property and therefore, the book's AuthorId is still 0.

Book {BookId: -2147482647} Added FK {AuthorId: 0}

As you just learned above, your code needs to take responsibility for the foreign key. Therefore, I'll just set it myself before calling the Entry method:

var book = existingAuthor.Books[0];
book.AuthorId = existingAuthor.AuthorId;
context.Entry(book).State = EntityState.Added;

Now there's no question about AuthorId being 2.

Book {BookId: -2147482647} Added FK {AuthorId: 2}
Dealing with Unknown State
I've pointed out a few times now that there's a problem with calling Update on entities that haven't been edited. They get sent to the database with an UPDATE command, which can be a wasted call and possibly have a bad effect on performance.

However, there's an interesting use for Update: when you have objects coming in and you have no idea if they need to be added, updated, or ignored. Notice that I am leaving "deleted" out of that list. You must supply some way of detecting whether an object is to be deleted. In typical controller methods, including those generated by the Visual Studio templates, you do have explicit methods for inserting, updating, and deleting, so there's no mystery.

What if your logic is different from these controller methods and a request passes in an Author object with—or maybe even without—attached objects? I'll use the same code that represents an existing Author to demonstrate:

If you start tweaking other factors, such as removing navigation properties or FK properties or changing the nullability of the foreign key, you'll have another batch of resulting behavior to be aware of.

Remember that I chose to only use DetectChanges directly and not SaveChanges. That's a nice way of testing things out without bothering with the database. Some of those scenarios where it was necessary to call DetectChanges to get the expected behavior will be solved by calling SaveChanges.

As I stated earlier, I can never keep all of these behaviors in my head. I depend on integration tests to make sure things are working as I expect them to. And because I'm not using a database in any of the examples above, you can write tests without a database provider, not even the InMemory provider. You can just build assertions about the state of the entities within the ChangeTracker.
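As a sketch of that advice, an integration test along these lines can assert against the change tracker alone, with no SaveChanges call (xUnit is assumed here; PubContext, Author, and Book are as defined earlier; the assertions mirror the first tracked-graph scenario):

```csharp
using System.Linq;
using Microsoft.EntityFrameworkCore;
using Xunit;

public class ChangeTrackerTests
{
    [Fact]
    public void NewBookOnTrackedAuthorIsAddedWithAuthorsKey()
    {
        using var context = new PubContext();
        var author = context.Authors.First();

        var book = new Book { Title = "A Brand New Book" };
        author.Books.Add(book);
        context.ChangeTracker.DetectChanges();

        // Assert against the change tracker only: no database round trip.
        var entry = context.Entry(book);
        Assert.Equal(EntityState.Added, entry.State);
        Assert.Equal(author.AuthorId, book.AuthorId);
    }
}
```

Because the assertions inspect entity state rather than saved rows, the same test shape works for each of the scenarios above.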
• I built a new Issue object with a title and description, plus marked it as being open. I also added an initial task within the Issue.
• I created a new IDocumentSession object ("session" in the code up above) that you'll use to both query a Marten database and persist changes. The IDocumentSession both implements the unit of work pattern to govern logical transaction boundaries and represents a single connection to the underlying database, thus making it important to ensure that the session is disposed to release the underlying database connection when you're done with the session.
• I explicitly told Marten that the new Issue document should be persisted as an "upsert" operation.
• I committed the one pending document change with the call to SaveChangesAsync().

It may be more interesting, so let's talk about what I did not have to do in any of the code above.

I didn't have to write any explicit mapping of the Issue document type to any kind of Postgresql table structure. Marten stores the document data as serialized JSON, so there isn't a lot of code-intensive mapping configuration like you'd frequently hit with Object Relational Mappers like Entity Framework Core or the older NHibernate.

There was no need to first perform any kind of database schema migration to set up the underlying Postgresql database schema. Using Marten's default "development friendly" configuration that you used to construct the DocumentStore up above, Marten quietly builds the necessary database tables and functions to store the Issue documents behind the scenes the first time you try to write or read Issue documents. As a developer, you can focus on just writing functionality and let Marten deal with the grunt work of building and modifying database schema objects. Again, compare that experience with Marten to the effort you have to make with Object Relational Mapper tools to craft database migration scripts.

Nowhere in the code did I have to assign an identity (primary key) to the new Issue document. Marten's default assumption is that a public property (or field) named Id is the identity for a document type. Because Issue.Id is of type GUID, Marten automatically assigns a sequential GUID for new documents passed into the IDocumentSession.Store() method that don't already have an established identity. In this case, Marten happily sets the value of Id onto the new Issue document in the course of the Store() method.

To illustrate the identity behavior, let's immediately turn around and load a new copy of the new Issue document with this code:

// Now let's reload that issue
var issue2 = await session
  .LoadAsync<Issue>(issue.Id)
  .ConfigureAwait(false);

So far, you've seen nothing that would be difficult to reproduce on your own. After all, you're just saving and loading documents.

In the past when I've described Marten to other developers, they frequently say "but you can't query within the JSON data itself though, right?" Fortunately, Marten has robust support for LINQ querying that happily queries within the stored JSON data. As an example, let's say that you want to query for the last 10 open issues with a LINQ query:

var openIssues = await session
  .Query<Issue>()
  .Where(x => x.IsOpen)
  .OrderByDescending(x => x.Opened)
  .Take(10)
  .ToListAsync().ConfigureAwait(false);

As I'll show later in this article, it's not only possible to query from within the structured JSON data, but you can also add computed indexes in Marten that work within the stored JSON data.

Admittedly, Marten's LINQ support is short of what you may be used to with Entity Framework Core or the older NHibernate tooling, but all the most common operators and usages of Where() clauses are well supported. Marten also has some specific extensions for LINQ that many users find useful.

Relations Between Documents
The sweet spot for document database approaches, like Marten's, is when the entities are largely self-contained with few relationships between different types of entities. Because Marten is built on top of a relational database engine, it still has the ability to enforce relational integrity between entity types.

To illustrate this, let's introduce a new User document type within our issue tracking system:

public class User
{
  public Guid Id { get; set; }
  public string FirstName { get; set; }
  public string LastName { get; set; }
  public string Role { get; set; }
}

Now, I'd like all of the Issue documents to refer to both an assigned user and to the original user who created the issue. I'll add a pair of new properties to the Issue document:

public class Issue
{
  public Guid Id { get; set; }

  public Guid? AssigneeId { get; set; }
  public Guid? OriginatorId { get; set; }

  // Other properties
}

To create foreign keys from the Issue document type to the new User document type, I need to revisit the DocumentStore bootstrapping from before and use this code to configure Marten:
Why a Document Database?
From my own experience, document databases can sometimes enable much better developer productivity by eliminating so much of the code ceremony that is forced upon you by the RDBMS + ORM combination. Because there's less effort necessary to map your object model in code to the underlying storage, it's far easier to iterate or evolve your object model over time compared to the more traditional relational database approach. Document databases are especially effective with complex, hierarchical data structures that can often be a poor fit in relational models. In addition, document databases excel with polymorphic collections that frequently bedevil ORMs.

The introduction of the new User document type and the foreign key relationships from Issue to User will require changes to the underlying database, but not to worry, because Marten detects that for you and happily makes the necessary database changes the first time you read or write Issue documents.

Foreign key relationships with Marten will work exactly as you'd expect if you have any experience with relational databases, as shown in this code:

var issue = new Issue
{
  // reference a non-existent User
  AssigneeId = Guid.NewGuid()
};

session.Store(issue);

// This call will fail!
await session.SaveChangesAsync()
  .ConfigureAwait(false);

Maybe more interesting is the ability in Marten to fetch related documents when querying within one document type. For example, let's say that you're building a Web service where you'll be making the same query for the 10 most recent open issues, but this time, you also need to query for the related User documents for the people assigned to these issues.

You could use two separate queries, like this:

var openIssues = await session.Query<Issue>()
  .Where(x => x.IsOpen)
  .OrderByDescending(x => x.Opened)
  .Take(10)
  .ToListAsync().ConfigureAwait(false);

// Find the related User documents
var userIds = openIssues
  .Where(x => x.AssigneeId.HasValue)
  .Select(x => x.AssigneeId.Value)
  .Distinct()
  .ToArray();

var users = await session
  .LoadManyAsync<User>(userIds)
  .ConfigureAwait(false);

The general rule of thumb for better performance using Marten is to reduce the number of round trips between the application and database server, so let's use Marten's Include() functionality to fetch the related User documents within the same round trip to the database.

In the query above, Marten stores the related User documents in the Users dictionary by the User.Id.

As an aside, the Include() operator is specific to Marten (other .NET tools have similar capabilities, and Marten's support was itself inspired by RavenDb's equivalent feature). When using Marten, it's important to consider whether or not any generalized abstraction that you place around Marten to avoid vendor lock-in may eliminate the ability to use the very advanced features of Marten that will make your system perform well.

Unit of Work Transactions with Marten
As stated earlier, the Marten IDocumentSession is an implementation of the unit of work pattern. According to the original statement by Martin Fowler, the unit of work: "Maintains a list of objects affected by a business transaction and coordinates the writing out of changes and the resolution of concurrency problems."

Let's jump right into a contrived example that shows an IDocumentSession variable named session used to create and commit a single database transaction that deletes some Issue documents, stores changes to User documents, and stores a brand-new issue in one single transaction:

session.Delete<User>(oldUserId);

session
  .DeleteWhere<Issue>
  (x => x.OriginatorId == fakeUserId);

// store some User documents
session.Store(newAdmin, reporter);

// store a new Issue
session.Store(new Issue
{
  Title = "Help!"
});

await session.SaveChangesAsync()
  .ConfigureAwait(false);

Hopefully, that looks very straightforward, but there are a couple of valuable things to note that set Marten apart from some other alternative document databases:

• Marten is happily able to process updates to multiple types of documents in one transaction.
• By virtue of being on top of Postgresql, Marten has ACID-compliant transactional integrity where data is always consistent, as opposed to the BASE model of eventual consistency used by many other document databases.
Next up, let's eliminate the need to deserialize the Issue document data and do the in-memory mapping to the IssueView structure. You can simply do a LINQ Select() transform like this:

[HttpGet("/issues/open/user/{userId}")]
public Task<IReadOnlyList<IssueView>>
  GetOpenIssues(
    Guid userId,
    [FromServices] IQuerySession session)
{
  return session.Query<Issue>()
    .Where(x => x.AssigneeId == userId
      && x.IsOpen)
    .OrderBy(x => x.Opened)

Does reading that list kind of make you a little tired? It does me. The point here is that LINQ querying comes with some significant performance overhead. That being said, I'll argue until I'm blue in the face that LINQ is one of the very best features of .NET and a positive differentiator for .NET versus other platforms.

Fortunately, Marten has a feature we call "compiled queries" that lets you have all the good parts of LINQ without incurring the performance overhead. Let's take the LINQ query above and move that to a compiled query class called OpenIssuesByUser, as shown in Listing 2.

Moving to the compiled query turns the controller method into this code:
Using Cosmos DB in
.NET Core Projects
If you're a .NET developer, like me, you've likely been used to storing your data as relational data even in cases when it wasn't the most logical way to store state. Changing our thinking about relational and non-relational stores has been going on for some time now. If you're building Azure-hosted projects and have a need for document-based storage, Cosmos DB is a great way to gain high availability and redundancy. In this article, I'll show you what Cosmos DB is and how you can use the SDK to store, search, and update your own documents in the cloud.

Shawn Wildermuth
shawn@wildermuth.com
wildermuth.com
@ShawnWildermuth

Shawn Wildermuth has been tinkering with computers and software since he got a Vic-20 back in the early '80s. A Microsoft MVP since 2003, he's also involved with Microsoft as an ASP.NET Insider and ClientDev Insider. He's the author of over twenty Pluralsight courses and eight books, an international conference speaker, and one of the Wilder Minds. You can reach him at his blog at http://wildermuth.com. He's also making his first feature-length documentary about software developers, called "Hello World: The Film." You can see more about it at http://helloworldfilm.com.

What Is Cosmos DB?

In a world where NoSQL databases are a dime a dozen, Cosmos DB is a different beast. Although at its core it's just a document database, Cosmos DB is a hosted data platform for solving problems of scale and availability. Ultimately, it's a document database as a service that supports low latency, high availability, and geolocation. With features like SLA-backed availability and enterprise-level security, small and large businesses can rely on the Azure-deployed service.

Cosmos DB is accessible through a variety of APIs and language integrations. In general, you can use the following ways to interact with Cosmos DB:

• SQL API (via libraries)
• MongoDB wrapper
• Cassandra wrapper
• Gremlin wrapper
• Table API
• Entity Framework Core provider

If you're already using MongoDB, Cassandra, or Gremlin, you can use Cosmos DB as a drop-in replacement via these APIs. Essentially, Cosmos DB exposes compatible APIs so that existing code can connect to it.

Cosmos DB also supports a Table API, which can be a good replacement for Azure Table Storage. If you're thinking that you came here to replace your relational database (e.g., SQL Server, Postgres, etc.), that's not really what it's about. Let's talk about NoSQL versus relational data stores first.

It's easy to think about data storage as relational databases first because that's likely many developers' first experience with storing data.

Although many developers (including .NET developers) think about the world in terms of objects, relational data stores think about data in a different way. Mechanisms like Object Relational Mappers (ORMs) have tried to hide this difference from developers. The basics of relational databases (schema, constraints, keys, transactions, and isolation levels) are often lost for the sake of quickly getting up to speed and getting projects completed. This has left many developers holding onto relational databases as their one and only way to store data.

Using Cosmos DB with .NET Core

Although Cosmos DB provides several mechanisms to connect to the service (listed above), this article focuses on accessing the service with a .NET Core project. The .NET Core Cosmos DB library lets you store documents in Cosmos DB.

What do I mean by documents? If you're coming to Cosmos DB from a traditional relational database, you're used to thinking about data in a two-dimensional matrix (that is, a table). Tables store data in rows made up of columns that are typically (but not always) primitives (such as strings, numbers, etc.). In order to include complex objects, tables are related to each other through foreign keys, as seen in Figure 1.

In document stores, the data is stored as a single entity. Documents are typically atomic, but because they can store more complex objects, the type of data you can store is more expressive. This isn't a matter of one model being better than the other; rather, for some situations, document databases simply make more sense.

It's thought that because of the object orientation of many languages we use, document databases make more sense, but that's a bad reason to use a document database. Instead, you should look at the use of the data. When you're storing something like customers and orders, relational could make more sense, as those relationships are important to enforce in the database server. Being able to reason about the kinds of data stored often makes relational stores more logical.
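To make that relational-versus-document contrast concrete, here is a tiny sketch (written in Python purely for illustration; the customer/order shapes are invented and are not part of any Cosmos DB API):

```python
# Relational shape: two "tables" (lists of flat rows) linked by a
# foreign key, as in Figure 1.
customers = [{"id": 1, "name": "Ada"}]
orders = [
    {"id": 10, "customer_id": 1, "total": 25.0},
    {"id": 11, "customer_id": 1, "total": 9.5},
]

# Document shape: one self-contained entity holding the nested data.
customer_doc = {
    "id": 1,
    "name": "Ada",
    "orders": [
        {"id": 10, "total": 25.0},
        {"id": 11, "total": 9.5},
    ],
}

# Reassembling the relational shape requires a join-style lookup;
# the document already carries everything about the entity.
joined = [o for o in orders if o["customer_id"] == customers[0]["id"]]
assert len(joined) == len(customer_doc["orders"])
```

The document form keeps the whole customer in one read, at the cost of making cross-entity constraints (the kind a relational database enforces for you) your own responsibility.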
The Emulator

Azure's Cosmos DB is a hosted service. This service is meant to be used so you can gain from the sheer scale that Cosmos DB offers.

Currently, the emulator only works on Windows, but you can connect to it from Mac environments (see https://shawn.ink/cosmosdb-on-mac for more information).

Figure 2: Azure Cosmos DB emulator

The main Web page of the emulator shows you the connection information you can use to connect to the Cosmos DB instance. For this article, I'll be using the connection string, as seen in Figure 3.

Getting Started

As I stated earlier, there are multiple ways to access Cosmos DB, but for this article, I'm focusing on the Azure.Cosmos NuGet package. The first thing is to add the package to your project, as seen in Figure 4. Note that as of this writing, v4 of this package is in preview, so you'll need to check Include prerelease to see the latest version of the package.

{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft.AspNetCore": "Warning"
    }
  },
  "AllowedHosts": "*",
  "ConnectionStrings": {
    "Cosmos": "YOUR CONNECTION STRING"
  }
}

Of course, you can store this in any way you see fit, but for my purposes, I'll include it here. You're now ready to start working with it.

var connString =
    _config.GetConnectionString("Cosmos");
_client = new CosmosClient(connString);

You first get the connection string from the IConfiguration object that you injected into the constructor, then just create the new client as necessary.

Creating Databases and Collections

In Cosmos DB, a database is a container for collections of data. These two terms are just part of the hierarchy, as seen in Figure 5.

Before you can store documents, you need a database and a container. You could create these in the user interface of the emulator, but I suggest you do it with code. First, you need to define a database name and a name for the container for the data:

For this example, I'm going to use a repository pattern to provide access to the data. Start with a pretty simple class:

builder.Services.AddScoped<TelemetryRepository>();

Finally, in my API call, I'll just inject it in the Minimal API (see my prior article for more information: https://www.codemag.com/Article/2201081/Minimal-APIs-in-.NET-6):

app.MapGet("/meter/telemetry",
    async (TelemetryRepository repo) =>
    {
        return Results.Ok(await repo.GetAll());
    });

To use queries in the Cosmos DB SDK, you can create a QueryDefinition by using the SQL text. Notice that in this example, I'm using a parameter (Cosmos DB, just like any other SQL database, shouldn't be queried with concatenated strings; please parameterize your queries):

var sql = "SELECT * FROM c WHERE c.id = @id";

var query = new QueryDefinition(sql)
    .WithParameter("@id", monitorId);

var results = new List<Telemetry>();

var iterator = container
    .GetItemQueryIterator<Telemetry>(query);

There is also support for Upsert, which creates the object if it doesn't exist and replaces it if it does, like so:

Shawn Wildermuth
MVC in a Nutshell

Model-View-Controller (MVC) is both a design pattern and an architecture pattern. It's seen as more of an architectural pattern, as it tries to solve problems that affect the application entirely, whereas design patterns are limited to solving a specific technical problem.

MVC divides an application into three major logical sections:

• Model
• View
• Controller

Bilal Haidar
bhaidar@gmail.com
https://www.bhaidar.dev
@bhaidar

Bilal Haidar is an accomplished author, a Microsoft MVP of 10 years, an ASP.NET Insider, and has been writing for CODE Magazine since 2007. With 15 years of extensive experience in Web development, Bilal is an expert in providing enterprise Web solutions. He works at Consolidated Contractors Company in Athens, Greece as a full-stack senior developer. Bilal offers technical consultancy for a variety of technologies including Nest JS, Angular, Vue JS, JavaScript, and TypeScript.

The Model component governs and controls the application database(s). It's the only component in MVC that can interact with the database, execute queries, and retrieve, update, delete, and create data. Not only that, but it's also responsible for guaranteeing the evolution of the database structure from one stage to another by maintaining a set of database migrations. The Model responds to instructions coming from the Controller to perform certain actions in the database.

The View component generates and renders the user interface (UI) of the application. It's made up of HTML/CSS and possibly JavaScript. It receives the data from the Controller, which has received the data from the Model. It merges the data with the HTML structure to generate the UI.

The Controller component acts as a mediator between the View and Model components. It receives a request from the client for a specific View, coordinates with the Model component to query for data (or update data), then decides which View component to return, and finally packages the View together with the related data into a single response.

One component often overlooked or perceived as part of the Controller is the Routing Engine. It's the brains behind the MVC pattern and one of the most important components in MVC: it initially receives the request from the client and determines which Controller is going to handle the request.

Figure 1 shows all of the components, together with their relationships, that make up the MVC pattern.

I can explain Figure 1 as follows:

• The browser (client) requests a page (view).
• The router engine, living inside the application, receives the request.
• The router engine runs an algorithm to pick a single Controller to handle the request.
• The Controller decides on the View to return, communicates with the Model to retrieve/store any data, and sends a response back to the browser.
• The Model communicates with the database, as needed.
• The View renders the page (view) requested by the browser.

Now that you know how MVC works, let's see how PHP Laravel implements MVC.
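The request flow described above can be sketched in a few lines of code. This is a framework-agnostic illustration (written in Python only for brevity; the function and route names are invented and do not correspond to any Laravel API):

```python
# Model: the only layer that touches the "database" (a dict here).
FAKE_DB = {1: {"id": 1, "title": "Hello MVC"}}

def post_model_find(post_id):
    return FAKE_DB.get(post_id)

# View: merges data with a (trivial) template to produce the UI.
def post_view(post):
    return f"<h1>{post['title']}</h1>"

# Controller: mediates between Model and View.
def show_post_controller(post_id):
    post = post_model_find(post_id)   # ask the Model for data
    if post is None:
        return "404 Not Found"
    return post_view(post)            # package data into a View

# Routing engine: picks the controller for an incoming request.
ROUTES = {"/posts": show_post_controller}

def handle_request(path, post_id):
    controller = ROUTES[path]         # allocate a controller
    return controller(post_id)        # controller produces the response

print(handle_request("/posts", 1))    # -> <h1>Hello MVC</h1>
```

Each layer only talks to its neighbor: the router never touches the database, and the view never decides which data to load. That separation is the whole point of the pattern.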
To understand how PHP Laravel implements MVC, I'll go through a typical Laravel project structure and show you how the Laravel team bakes the MVC concepts into the framework.

Let's get started by creating a new PHP Laravel project locally. To avoid repetition, I'll point you to a recent article I published in CODE Magazine Nov/Dec 2021, where I show you a step-by-step guide on creating a Laravel application. You can follow that article here: https://www.codemag.com/Article/2111071/Beginner%E2%80%99s-Guide-to-Deploying-PHP-Laravel-on-the-Google-Cloud-Platform.

The latest official version of Laravel is v9.x.

Model

The first component is the Model, or M, of MVC. The Model plays the main role of allowing the application to communicate with the back-end database. Laravel includes Eloquent, an object-relational mapper (ORM) that makes it easy to communicate with the back-end database.

In Laravel, you create a Model class for each and every database table. The Model class allows you to interact with the database table to create, update, query, and delete data in the database.

The command generates:

• The \app\Models\Post.php Model class
• An anonymous migration file for the Posts table located under the \database\migrations\ directory
• The \database\factories\PostFactory.php Factory class

The command generates the Post.php model file:

namespace App\Models;

use Illuminate\Database\Eloquent\Factories\HasFactory;
use Illuminate\Database\Eloquent\Model;

class Post extends Model
{
    use HasFactory;
}

Notice how the Post class extends the base class Model. This is how the Post class inherits all methods and properties from the base class and allows your application to interact with the database table posts. Also, the Post class uses the HasFactory trait. This is needed to link the Post and PostFactory classes.

Listing 2: Adjusted up() method

public function up()
{
    Schema::create('posts',
        static function (Blueprint $table) {
            $table->id();
            $table->string('slug');
            $table->string('title');
            $table->string('body');
            $table->unsignedBigInteger('user_id');
            $table->date('published_at')
                ->useCurrent();
            $table->timestamps();

            $table->foreign('user_id')
                ->references('id')
                ->on('users');
        }
    );
}

I've added the following columns:

Locate and open the Post.php file and add the following relationship:

public function user():
    \Illuminate\Database\Eloquent\Relations\BelongsTo
{
    return $this->belongsTo(User::class);
}

The function user() represents the relationship between the two models: a Post belongsTo a User.

The inverse of this relationship goes inside the User.php file. Let's add the following relationship:

public function posts():
    \Illuminate\Database\Eloquent\Relations\HasMany
{
    return $this->hasMany(Post::class);
}
DB_CONNECTION=mysql
DB_HOST=mysql
DB_PORT=3306
DB_DATABASE=laravel_mvc_app
DB_USERNAME=sail
DB_PASSWORD=password

Add the following section instead:

DB_CONNECTION=sqlite
DB_DATABASE=/var/www/html/database/database.sqlite

The reason for the /var/www/html path is that you're using Laravel Sail (https://laravel.com/docs/9.x/sail).

Now, inside the /database folder at the root of the application, create a new empty file named database.sqlite.

And that's it!

Eloquent Migrations

Switch to the terminal and run the following command to migrate the database:

sail artisan migrate

The command runs all of the migration files and creates the necessary database tables and objects.

In this case, two tables in the database are created: Users and Posts. Figure 2 shows the Posts table structure.
To connect to Laravel Tinker, run the following command:

sail artisan tinker

Or:

php artisan tinker

Figure 3 shows Tinker up and running.

You're using the User Factory (which ships with any new Laravel application) to create five User records using fake data. Figure 4 shows the result of running the command inside Tinker.

The statement runs and displays the results of creating the five User records.

Let's create a new Post record, again using the Post Factory. Run the following command:

Post::factory()->create()

Figure 5 shows the result of running the command inside Tinker.

Inside Tinker, you can run any Eloquent statement that you'd usually run inside a Controller, as you'll see soon.

For that, let's try to query for all Post records in the database using the following Eloquent query:

Post::query()
    ->with('user')
    ->where('id', '>', 1)
    ->get()

The query retrieves all Post records with an ID > 1. It also makes use of another Eloquent feature, eager loading, to load the related User record and not only the user_id column.

Figure 6 shows the query results inside Tinker.

Notice that not only the user_id is returned but also another property named user that contains the entire user record. You can learn more about the powerful Eloquent eager loading here: https://laravel.com/docs/9.x/eloquent-relationships#eager-loading.

That's all for the Artisan Tinker for now!

Casting Columns

Eloquent has powerful and hidden gems baked into the framework. For instance, by default, Eloquent converts the timestamp columns created_at and updated_at to instances of Carbon (https://carbon.nesbot.com/docs/).

Laravel makes use of Carbon for dates everywhere in the framework source code.

Laravel defines a property named $dates on the base Model class that specifies which columns should be handled as dates:

protected $dates = [
    'created_at',
    'updated_at',
    'deleted_at'
];

You can extend this property by adding more columns to your Model.

However, Eloquent offers a more generic way of defining such conversions. It allows you to define the $casts property on your Models and decide on the conversion. For example, here, you're defining a new cast to date:

protected $casts = [
    'published_at' => 'date'
];

This works great! You can read more about casts in Laravel here: https://laravel.com/docs/9.x/eloquent-mutators#attribute-casting.

I tend to enjoy the flexibility and greater control that accessors and mutators in Laravel give me. Let's have a look.

Let's define an accessor and mutator, in Post.php, to store the published_at column in a specific date format and retrieve it in the same or some other date format:

public function publishedAt(): Attribute
{
    return Attribute::make(
        get: static fn ($value) =>
            Carbon::parse($value)?->format('Y-m-d'),
        set: static fn ($value) =>
            Carbon::parse($value)?->format('Y-m-d')
    );
}

This is the new format for writing accessors and mutators in Laravel 9. You define a new function using the camelCase version of the original column name. This function should return an Attribute instance.

An accessor and mutator can define only the accessor, only the mutator, or both. An accessor is defined by the get() function and the mutator is defined by the set() function.

The Attribute::make() function takes two parameters: the get() and set() functions. You can pass one of them, or both of them, depending on the use case.

The get() function is called the accessor (https://laravel.com/docs/9.x/eloquent-mutators#accessors-and-mutators). You use this function to decide what the value of this column will look like when retrieved and accessed.

The set() function is called the mutator (https://laravel.com/docs/9.x/eloquent-mutators#accessors-and-mutators). You can use this function to do your conversion logic before Laravel saves this model.

In this case, you're parsing the published_at field to a Carbon instance and then formatting it as YYYY-MM-DD. Eventually, that's how it will be stored in the database, without the time portion of the date field. Similarly, when the value is retrieved, it will maintain its format. In this case, the get() accessor is redundant. I use it when I want to display the field in a different format than the one stored in the database. For the sake of this demonstration, I use both to let you know that both accessors and mutators exist in Laravel.

That was a brief overview of the models in Laravel MVC. You can see that the Eloquent ORM is big and powerful.

Conclusion

PHP Laravel not only supports MVC architecture, but it also adds many productivity tools and concepts that make Web development in Laravel a breeze!

This is just the beginning of a detailed series of articles covering Web development with PHP Laravel. Now that you know the M of MVC, next time you will learn about the C and V! Stay tuned to discover more with PHP Laravel.

Bilal Haidar
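The accessor/mutator idea isn't unique to Laravel. As a rough analogy (in Python, with an invented Post class, purely for illustration), a property can run conversion logic when an attribute is read (the accessor) or written (the mutator):

```python
from datetime import datetime

class Post:
    def __init__(self):
        self._published_at = None

    @property
    def published_at(self):
        # "accessor": format the stored value on read
        return self._published_at.strftime("%Y-%m-%d")

    @published_at.setter
    def published_at(self, value):
        # "mutator": normalize the incoming value on write
        if isinstance(value, str):
            value = datetime.strptime(value, "%Y-%m-%d")
        self._published_at = value

post = Post()
post.published_at = "2022-05-01"   # stored internally as a datetime
print(post.published_at)           # read back as "2022-05-01"
```

The caller only ever sees the formatted string, just as a Laravel model's consumers only see what the Attribute's get() closure returns.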
Implementing
Face Recognition
Using Deep Learning and
Support Vector Machines
One of the most exciting features of artificial intelligence (AI) is undoubtedly face recognition.
Research in face recognition started as early as the 1960s, when early pioneers in the field
measured the distances of the various “landmarks” of the face, such as eyes, mouth, and nose, and
then computed the various distances in order to determine a person's identity. The work of early researchers was hampered by the limitations of the technology of the day. It wasn't until the late 1980s that we saw the potential of face recognition as a business need. And today, due to the technological advances in computing power, face recognition is gaining popularity and can be performed easily, even from mobile devices.

How exactly does face recognition work, and how can you make use of it using a language that you already know? In this article, I'll walk you through some applications that you can build to perform face recognition. Most interesting of all, you can use the applications that I'll demonstrate to recognize your own friends and family members.

Buckle-up, and get ready for some real action!

Like all my articles, this article is heavily hands-on, so be sure to buckle-up, and get ready for some real action! For this article, I'm going to assume that you are familiar with Python, and understand the basics of machine learning and deep learning. If you need a refresher on these two topics, be sure to refer to my earlier articles in CODE Magazine:

• Implementing Machine Learning Using Python and Scikit-learn, CODE Magazine, November/December 2017. https://www.codemag.com/Article/1711091/Implementing-Machine-Learning-Using-Python-and-Scikit-learn
• Introduction to Deep Learning, CODE Magazine, March/April 2020. https://www.codemag.com/Article/2003071/Introduction-to-Deep-Learning

Wei-Meng Lee
weimenglee@learn2develop.net
http://www.learn2develop.net
@weimenglee

Wei-Meng Lee is a technologist and founder of Developer Learning Solutions (www.learn2develop.net), a technology company specializing in hands-on training on the latest technologies. Wei-Meng has many years of training experience and his training courses place special emphasis on the learning-by-doing approach. His hands-on approach to learning programming makes understanding the subject much easier than reading books, tutorials, and documentation. His name regularly appears in online and print publications such as DevX.com, MobiForge.com, and CODE Magazine.

Face Detection

Before I discuss face recognition, it's important to discuss another related technique: face detection. As the name implies, face detection is a technique that identifies human faces in a digital image. Face detection is a relatively mature technology—remember back in the good old days of your digital camera when you looked through the viewfinder? You saw rectangles surrounding the faces of the people in the viewfinder.

Face detection with Haar cascades works like this:

• First, a set of positive images (images of faces) and a set of negative images (images without faces) are used to train the classifier.
• You then extract the features from the images. Figure 1 shows some of the features that are extracted from images containing faces.
• To detect faces from an image, you look for the presence of the various features that are usually found on human faces (see Figure 2), such as the eyebrow, where the region above the eyebrow is lighter than the region below it.
• When an image contains a combination of all these features, it is deemed to contain a face.

Figure 1: Edges in a Haar cascade that detect various features in an image

If you're interested in a visualization of how a face is detected, check out the following videos on YouTube:

• https://www.youtube.com/watch?v=hPCTwxF0qf4&t (see Figure 3)
• https://www.youtube.com/watch?v=F5rysk51txQ

Fortunately, without needing to know how Haar cascades work, OpenCV can perform face detection out of the box using a pre-trained Haar cascade, along with other Haar cascades for recognizing other objects. The list of predefined Haar cascades is available on GitHub at https://github.com/opencv/opencv/tree/master/data/haarcascades.

For face detection, you'll need the haarcascade_frontalface_default.xml file that you can download from the GitHub link in the previous paragraph.

Detecting Faces Using Webcam

Now that you have a basic idea of how face detection works using Haar cascades, let's write a Python program to turn on a webcam and then try to detect the face in it. I'll be using Anaconda.
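The "light region above a dark region" idea behind Haar-like features can be shown with a toy computation. The 4x4 "image" below is made up purely for illustration; real detectors use OpenCV's pre-trained cascades (and integral images for speed) rather than hand-rolled features like this:

```python
# A tiny grayscale "image": bright rows over dark rows, the kind of
# light-above-dark pattern you see at an eyebrow.
image = [
    [200, 210, 205, 198],   # bright rows (e.g., forehead)
    [195, 205, 200, 190],
    [ 40,  35,  50,  45],   # dark rows (e.g., eyebrow)
    [ 42,  38,  44,  47],
]

def region_sum(img, top, left, height, width):
    """Sum of the pixels in a rectangular region."""
    return sum(img[r][c]
               for r in range(top, top + height)
               for c in range(left, left + width))

# Haar-like "edge" feature over the window: top half minus bottom half.
feature = (region_sum(image, 0, 0, 2, 4) -
           region_sum(image, 2, 0, 2, 4))

# A strongly positive value signals a light-above-dark horizontal edge,
# one of the many weak cues a cascade combines when hunting for faces.
print(feature)   # -> 1262
```

A cascade evaluates thousands of such rectangle differences at many positions and scales, rejecting non-face windows early so that only promising regions get the full battery of checks.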
Figure 3: This video provides a visual approach to understanding how Haar cascades work

Figure 4: Detecting faces using the webcam

First, install OpenCV using the following command at the Anaconda Prompt (or Terminal if you are using a Mac):

$ pip install opencv-python

Next, create a file named face_detection.py and populate it with the code shown in Listing 1. Then, download the haarcascade_frontalface_default.xml file and save it into the same directory as the face_detection.py file.

To run the program, type the following command in the Anaconda Prompt:

$ python face_detection.py

Figure 4 shows that when the program detects a face, it draws a rectangle around it. If it detects multiple faces, multiple rectangles are shown.

Techniques to Recognize Faces

Now that you've learned how to detect faces, you are ready to tackle the bigger challenge of recognizing faces! Face detection is a technique for detecting faces in an image, and face recognition is a technique for recognizing faces in an image. Compared to face detection, face recognition is a much more complicated process and is an area of much interest to researchers, who are always looking to improve the accuracy of the recognition.

In this article, I'll discuss two techniques that you can generally use for face recognition:

• Deep learning using convolutional neural networks (CNN)
• Machine learning using support vector machines (SVM)

Deep Learning—Convolutional Neural Network (CNN)

In deep learning, a convolutional neural network (CNN) is a special type of neural network that is designed to process data through multiple layers of arrays. A CNN is well-suited for applications like image recognition and is often used in face recognition software.

In a CNN, convolutional layers are the fundamental building blocks that make all the magic happen. In a typical image recognition application, a convolutional layer is made up of
several filters to detect the various features of the image. Understanding how this works is best illustrated with an analogy.
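Alongside the analogy, the convolution operation itself is small enough to sketch directly. This is plain Python for clarity (real CNNs use optimized tensor libraries and learn their kernels from data; the vertical-edge kernel below is just one hand-picked example):

```python
def convolve2d(image, kernel):
    """Slide the kernel over the image; each output value is the sum
    of the element-wise products of the kernel and the patch under it."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            acc = sum(image[i + m][j + n] * kernel[m][n]
                      for m in range(kh) for n in range(kw))
            row.append(acc)
        out.append(row)
    return out

# An image with a vertical edge: dark columns on the left,
# bright columns on the right.
image = [
    [0, 0, 0, 9, 9],
    [0, 0, 0, 9, 9],
    [0, 0, 0, 9, 9],
]

# A classic vertical-edge kernel: it responds only where the pixels
# to the right of a point are brighter than those to the left.
kernel = [
    [-1, 0, 1],
    [-1, 0, 1],
    [-1, 0, 1],
]

print(convolve2d(image, kernel))   # -> [[0, 27, 27]]
```

The flat region produces 0 while the edge produces a strong response; a convolutional layer applies many such filters at once, each learning to fire on a different visual feature.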
Using VGGFace for Face Recognition

VGGFace refers to a series of models developed for face recognition. It was developed by the Visual Geometry Group (hence its VGG name) at the University of Oxford. The models were trained on a dataset comprised mainly of celebrities, public figures, actors, and politicians. Their names were extracted from the Internet Movie Database (IMDb) celebrity list based on their gender, popularity, pose, illumination, ethnicity, and profession (actors, athletes, politicians). The images for these names were fetched from Google Image Search, and multiple images for each name were downloaded, vetted by humans, and then labelled for training.

There are two versions of VGGFace:

• VGGFace: Developed in 2015, trained on 2.6 million images, a total of 2622 people
• VGGFace2: Developed in 2017, trained on 3.31 million images, a total of 9131 people

The original VGGFace uses the VGG16 model, which is a convolutional neural network with 16 layers (see Figure 6).

ResNet50 is a 50-layer Residual Network with 26M parameters. This residual network is a deep convolutional neural network that was introduced by Microsoft in 2015.

To know more about ResNet-50, go to https://viso.ai/deep-learning/resnet-residual-neural-network/.

SENet is a smaller network developed by researchers at DeepScale, the University of California at Berkeley, and Stanford University. The goal of SENet was to create a smaller neural network that can easily fit into computer memory and be easily transmitted over a computer network.

Let's now try out how VGGFace works and see if it can accurately recognize some of the faces that we throw at it. For this, you'll make use of the Keras implementation of VGGFace located at https://github.com/rcmalli/keras-vggface.
For this example, I’ll use Jupyter Notebook. First, you need to Download a copy of his headshot (Matthias_Sammer.jpg)
install the keras_vggface and keras_applications modules: and put it in the same folder as your Jupyter Notebook.
First, install the required packages:

!pip install keras_vggface
!pip install keras_applications

The keras_applications module provides model definitions and pre-trained weights for a number of popular architectures, such as VGG16, ResNet50, Xception, MobileNet, and more.

To use VGGFace (based on the VGG16 CNN model), you can specify the vgg16 argument for the model parameter:

from keras_vggface.vggface import VGGFace

model = VGGFace(model='vgg16')
# same as the following
model = VGGFace()  # vgg16 as default

To use VGGFace2 with the ResNet-50 model, you can specify the resnet50 argument:

model = VGGFace(model='resnet50')

To use VGGFace2 with the SENet model, specify the senet50 argument:

model = VGGFace(model='senet50')

For the example here, I'm going to use the SENet model. When you run the above code snippets, the weights for the trained model will be downloaded and stored in the ~/.keras/models/vggface folder. Here's the size (in bytes) of the weights downloaded for each model:

165439116 rcmalli_vggface_tf_resnet50.h5
175688524 rcmalli_vggface_tf_senet50.h5
580070376 rcmalli_vggface_tf_vgg16.h5

As you can observe, the VGG16 weights are the largest at 580 MB and the ResNet50 weights are the smallest at 165 MB.

You're now ready to test the model and see if it can recognize a face that it was trained to recognize. The first face that I want to try is Matthias Sammer (see Figure 7), a German former professional football player and coach who last worked as sporting director at Bayern Munich.

Figure 7: Image of Matthias Sammer (source: https://en.wikipedia.org/wiki/Matthias_Sammer#/media/File:Matthias_Sammer_2722.jpg)

The following code snippet will load the image, resize it to 224x224 pixels, convert the image into a NumPy array, and then use the model to make the prediction:

import numpy as np
from keras.preprocessing import image
from keras_vggface.vggface import VGGFace
from keras_vggface import utils

# load the image
img = image.load_img(
    './Matthias_Sammer.jpg',
    target_size=(224, 224))

# prepare the image
x = image.img_to_array(img)
x = np.expand_dims(x, axis=0)
x = utils.preprocess_input(x, version=1)

# perform prediction
preds = model.predict(x)
print('Predicted:',
    utils.decode_predictions(preds))

The result of the prediction is as follows:

Predicted:
[[["b' Matthias_Sammer'", 0.9927065],
  ["b' Bj\\xc3\\xb6rn_Ferry'", 0.0011530316],
  ["b' Stone_Cold_Steve_Austin'", 0.00084367086],
  ["b' Florent_Balmont'", 0.00058827153],
  ["b' Tim_Boetsch'", 0.0003584346]]]

From the result, you can see that the probability of the image containing Matthias Sammer's face is 0.9927065 (the highest probability).

Using Transfer Learning to Recognize Custom Faces
The previous section showed how you can use VGGFace2 to recognize some of the pretrained faces. Although this is cool, it isn't very exciting. A more interesting way to use VGGFace2 is to use it to recognize the faces that you want. For example, let's say that you want to use it to build an attendance system to recognize students in a class. To do that, you make use of a technique called transfer learning.

Transfer learning is a machine learning method where a model developed for a task is reused as the starting point for a model on a second task. Transfer learning reduces the amount of time that you need to spend on training.
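Picking the winning identity from the prediction output is just a max over the probabilities. A minimal sketch, using hypothetical (name, probability) pairs shaped like the decode_predictions result shown earlier:

```python
# Hypothetical (name, probability) pairs, shaped like the
# decode_predictions output shown above.
preds = [
    ("Matthias_Sammer", 0.9927065),
    ("Bjorn_Ferry", 0.0011530316),
    ("Stone_Cold_Steve_Austin", 0.00084367086),
]

# the predicted identity is the pair with the highest probability
best_name, best_prob = max(preds, key=lambda p: p[1])
print(best_name, best_prob)
```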
Recall that, in general, CNN models for image classification can be divided into two parts:
54 Implementing Face Recognition Using Deep Learning and Support Vector Machines codemag.com
• Barack Obama
• Donald Trump
• Tom Cruise
Figure 9: Folders containing images of the people you want to recognize

Figure 10: Some of the images in the folders

Once the images are prepared and saved in the respective folders, you need to perform some preprocessing on the images before you can use them for training. You need to extract the faces from the images so that only the faces are used for training. The steps are as follows:

Figure 11 shows the face detected in each image and the updated images. It's possible that in some images there will be multiple faces detected. In the event that there is no face or when there are multiple faces detected, the image is discarded.

Figure 11: Detecting faces in the image and updating the images with the detected faces
Listing 2: Preprocessing the images used for training

import cv2
import os
import pickle
import numpy as np
import matplotlib.pyplot as plt
from PIL import Image

# set the directory containing the images
images_dir = os.path.join(".", headshots_folder_name)

current_id = 0
label_ids = {}

# iterates through all the files in each
# subdirectory
for root, _, files in os.walk(images_dir):
    for file in files:
        if file.endswith("png") or \
           file.endswith("jpg") or \
           file.endswith("jpeg"):
            # path of the image
            path = os.path.join(root, file)

            # get the label name (name of the person)
            label = os.path.basename(root).replace(
                " ", ".").lower()

            # add the label (key) and its number
            # (value)
            if not label in label_ids:
                label_ids[label] = current_id
                current_id += 1

            # load the image
            imgtest = cv2.imread(path,
                cv2.IMREAD_COLOR)
            image_array = np.array(imgtest, "uint8")

            # get the faces detected in the image
            faces = \
                facecascade.detectMultiScale(imgtest,
                    scaleFactor=1.1, minNeighbors=5)

            for (x_, y_, w, h) in faces:
                # draw the face detected
                face_detect = cv2.rectangle(imgtest,
                    (x_, y_),
                    (x_+w, y_+h),
                    (255, 0, 255), 2)
                plt.imshow(face_detect)
                plt.show()

                # resize the detected face to 224x224
                size = (image_width, image_height)

                # detected face region
                roi = image_array[y_: y_ + h,
                                  x_: x_ + w]

                # resize the detected head to
                # target size
                resized_image = cv2.resize(roi, size)
                image_array = np.array(
                    resized_image, "uint8")

                # remove the original image
                os.remove(path)

                # replace the image with only the face
                im = Image.fromarray(image_array)
                im.save(path)
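The label bookkeeping in Listing 2 (folder name, normalized label, numeric ID) can be exercised on its own. A minimal sketch, simulating the three folder names used in this article:

```python
# Simulated folder names (one folder per person, as in Figure 9)
folders = ["Barack Obama", "Donald Trump", "Tom Cruise"]

current_id = 0
label_ids = {}
for folder in folders:
    # same normalization as Listing 2: spaces -> dots, lowercase
    label = folder.replace(" ", ".").lower()
    if label not in label_ids:
        label_ids[label] = current_id
        current_id += 1

print(label_ids)
# {'barack.obama': 0, 'donald.trump': 1, 'tom.cruise': 2}
```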
Listing 3: The various layers in VGGFace16

Model: "vggface_vgg16"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_1 (InputLayer)         [(None, 224, 224, 3)]     0
_________________________________________________________________
conv1_1 (Conv2D)             (None, 224, 224, 64)      1792
_________________________________________________________________
conv1_2 (Conv2D)             (None, 224, 224, 64)      36928
_________________________________________________________________
pool1 (MaxPooling2D)         (None, 112, 112, 64)      0
_________________________________________________________________
conv2_1 (Conv2D)             (None, 112, 112, 128)     73856
_________________________________________________________________
conv2_2 (Conv2D)             (None, 112, 112, 128)     147584
_________________________________________________________________
pool2 (MaxPooling2D)         (None, 56, 56, 128)       0
_________________________________________________________________
conv3_1 (Conv2D)             (None, 56, 56, 256)       295168
_________________________________________________________________
conv3_2 (Conv2D)             (None, 56, 56, 256)       590080
_________________________________________________________________
conv3_3 (Conv2D)             (None, 56, 56, 256)       590080
_________________________________________________________________
pool3 (MaxPooling2D)         (None, 28, 28, 256)       0
_________________________________________________________________
conv4_1 (Conv2D)             (None, 28, 28, 512)       1180160
_________________________________________________________________
conv4_2 (Conv2D)             (None, 28, 28, 512)       2359808
_________________________________________________________________
conv4_3 (Conv2D)             (None, 28, 28, 512)       2359808
_________________________________________________________________
pool4 (MaxPooling2D)         (None, 14, 14, 512)       0
_________________________________________________________________
conv5_1 (Conv2D)             (None, 14, 14, 512)       2359808
_________________________________________________________________
conv5_2 (Conv2D)             (None, 14, 14, 512)       2359808
_________________________________________________________________
conv5_3 (Conv2D)             (None, 14, 14, 512)       2359808
_________________________________________________________________
pool5 (MaxPooling2D)         (None, 7, 7, 512)         0
_________________________________________________________________
flatten (Flatten)            (None, 25088)             0
_________________________________________________________________
fc6 (Dense)                  (None, 4096)              102764544
_________________________________________________________________
fc6/relu (Activation)        (None, 4096)              0
_________________________________________________________________
fc7 (Dense)                  (None, 4096)              16781312
_________________________________________________________________
fc7/relu (Activation)        (None, 4096)              0
_________________________________________________________________
fc8 (Dense)                  (None, 2622)              10742334
_________________________________________________________________
fc8/softmax (Activation)     (None, 2622)              0
=================================================================
Total params: 145,002,878
Trainable params: 145,002,878
Non-trainable params: 0
_________________________________________________________________
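The parameter counts in Listing 3 can be verified by hand: a Conv2D layer with a k-by-k kernel, c_in input channels, and c_out filters has (k*k*c_in + 1)*c_out parameters, and a Dense layer with n_in inputs and n_out units has (n_in + 1)*n_out. A quick check against a few rows of the table:

```python
def conv_params(k, c_in, c_out):
    # k x k kernel over c_in channels, plus one bias per filter
    return (k * k * c_in + 1) * c_out

def dense_params(n_in, n_out):
    # weight matrix plus one bias per output unit
    return (n_in + 1) * n_out

print(conv_params(3, 3, 64))      # conv1_1: 1792
print(conv_params(3, 64, 64))     # conv1_2: 36928
print(conv_params(3, 512, 512))   # conv5_1: 2359808
print(dense_params(25088, 4096))  # fc6: 102764544
print(dense_params(4096, 2622))   # fc8: 10742334
```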
Building the Model
Next, you're ready to build the model. First, load the VGGFace16 model:

from keras_vggface.vggface import VGGFace

Let's now add the custom layers so that the model can recognize the faces in your own training images:

x = base_model.output
x = GlobalAveragePooling2D()(x)
Listing 4: The VGGFace16 model now includes the additional layers that you have added to it

Model: "model"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
...
Because the first 19 layers were already trained by the VGGFace16 model, you only need to train the new layers that you've added to the model. Essentially, the new layers that you've added will be trained to recognize your own images.

Compiling and Training the Model
You can now compile the model using the Adam optimizer and the categorical cross-entropy loss function:

model.compile(optimizer='Adam',
    loss='categorical_crossentropy',
    metrics=['accuracy'])

Finally, train the model using the following arguments:

model.fit(train_generator,
    batch_size = 1,
    verbose = 1,
    epochs = 20)

Saving the Model
Once the model is trained, it's important to save it to disk first. If not, you must train the model again every time you want to recognize a face:

# creates a HDF5 file
model.save(
    'transfer_learning_trained' +
    '_face_cnn_model.h5')

from keras.models import load_model

# returns a compiled model identical to
# the previous one
model = load_model(
    'transfer_learning_trained' +
    '_face_cnn_model.h5')

Saving the Training Labels
Using the ImageDataGenerator instance, you can generate a mapping of the index corresponding to each person's name:

import pickle

class_dictionary = \
    train_generator.class_indices
class_dictionary = {
    value:key for key, value in
    class_dictionary.items()
}
print(class_dictionary)

The above code prints out the following:

{
    0: 'Barack Obama',
    1: 'Donald Trump',
    2: 'Tom Cruise'
}
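The inversion of the class-index dictionary (class name to index becomes index to class name) is easy to verify in isolation, using the same class names as in this example:

```python
# class_indices as ImageDataGenerator reports them (name -> index)
class_indices = {"Barack Obama": 0, "Donald Trump": 1, "Tom Cruise": 2}

# invert the mapping so a predicted index can be turned back into a name
class_dictionary = {value: key for key, value in class_indices.items()}
print(class_dictionary)
```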
Listing 5: Predicting the faces

# for detecting faces
facecascade = \
    cv2.CascadeClassifier(
        'haarcascade_frontalface_default.xml')

for i in range(1, 30):
    test_image_filename = f'./facetest/face{i}.jpg'

    # load the image
    imgtest = cv2.imread(test_image_filename,
        cv2.IMREAD_COLOR)
    image_array = np.array(imgtest, "uint8")

    # get the faces detected in the image
    faces = facecascade.detectMultiScale(imgtest,
        scaleFactor=1.1, minNeighbors=5)

    # if not exactly 1 face is detected, skip this photo
    if len(faces) != 1:
        print(f'---We need exactly 1 face; photo skipped---')
        print()
        continue

    (x_, y_, w, h) = faces[0]

    # draw the face detected
    face_detect = cv2.rectangle(
        imgtest, (x_, y_), (x_+w, y_+h), (255, 0, 255), 2)
    plt.imshow(face_detect)
    plt.show()

    # resize the detected face to 224x224
    size = (image_width, image_height)
    roi = image_array[y_: y_ + h, x_: x_ + w]
    resized_image = cv2.resize(roi, size)

    # prepare the image for prediction
    x = image.img_to_array(resized_image)
    x = np.expand_dims(x, axis=0)
    x = utils.preprocess_input(x, version=1)

    # making prediction
    predicted_prob = model.predict(x)
    print(predicted_prob)
    print(predicted_prob[0].argmax())
    print("Predicted face: " +
        class_list[predicted_prob[0].argmax()])
    print("============================\n")
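The tail end of Listing 5 turns the model's probability vector into a name with argmax. Here's that step in isolation, with a made-up probability vector (NumPy only):

```python
import numpy as np

class_list = ['Barack Obama', 'Donald Trump', 'Tom Cruise']

# made-up softmax output for one image (shape: 1 x num_classes)
predicted_prob = np.array([[0.05, 0.90, 0.05]])

index = predicted_prob[0].argmax()
print("Predicted face: " + class_list[index])
```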
This dictionary is needed so that later on, when you perform a prediction, you can use the result returned by the model (which is an integer and not the person's name) to get the person's name. Save the dictionary object using Pickle:

# save the class dictionary to pickle
face_label_filename = 'face-labels.pickle'
with open(face_label_filename, 'wb') as f:
    pickle.dump(class_dictionary, f)

Testing the Trained Model
In the folder where you saved your Jupyter Notebook files, create a folder named facetest and add samples of images containing faces of the people you want to recognize. Figure 12 shows some of the images in the folders.

Import the modules and load the labels for the various faces:

import cv2
import os
import pickle
import numpy as np
from PIL import Image
import matplotlib.pyplot as plt
from keras.preprocessing import image
from keras_vggface import utils

# dimension of images
image_width = 224
image_height = 224

# load the training labels
face_label_filename = 'face-labels.pickle'
with open(face_label_filename, "rb") as f:
    class_dictionary = pickle.load(f)

class_list = [value for _, value in
    class_dictionary.items()]
print(class_list)
The loaded face label is a dictionary containing the mapping of integer values to the names of the people that you have trained. The above code snippet converted that dictionary into a list that looks like this:

['Barack Obama', 'Donald Trump', 'Tom Cruise']

Later on, after the prediction, you'll make use of this list to obtain the name of the predicted face.

Listing 5 shows how to iterate through all the images in the facetest folder and send each image to the model for prediction. Figure 13 shows some of the results.
Face Recognition Using Webcam
With the model trained to recognize faces belonging to Obama, Trump, and Cruise, it would be fun to be able to recognize their faces using the webcam. Listing 6 shows how you can use the webcam to perform the prediction in real-time.

Once the line is drawn to separate the classes, you can then use it to predict future data. For example, given the snout length and ear geometry of a new unknown animal, you can now use the dividing line as a classifier to predict whether the animal is a dog or a cat.
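The dog/cat example can be made concrete. The article uses an SVM for this; as a stand-in, the sketch below learns a separating line with a perceptron (a simpler linear classifier, but the same idea of a dividing line) on made-up snout-length/ear-geometry data:

```python
import numpy as np

# made-up training data: [snout length, ear pointiness]
X = np.array([[7.0, 2.0], [8.0, 1.5], [6.5, 2.5],   # dogs
              [2.0, 8.0], [1.5, 7.0], [2.5, 9.0]])  # cats
y = np.array([1, 1, 1, -1, -1, -1])  # 1 = dog, -1 = cat

# perceptron: nudge the line whenever a point is misclassified
w = np.zeros(2)
b = 0.0
for _ in range(100):
    for xi, yi in zip(X, y):
        if yi * (xi @ w + b) <= 0:
            w += yi * xi
            b += yi

def classify(point):
    # which side of the dividing line does the point fall on?
    return "dog" if point @ w + b > 0 else "cat"

print(classify(np.array([7.5, 1.0])))  # long snout, flat ears
print(classify(np.array([1.0, 8.5])))  # short snout, pointy ears
```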
Listing 8: Testing the model

import cv2
import pickle
...
        skipped---\n')
        continue
Importing the Packages
First, install the scikit-image module:

!pip install scikit-image

Then, import all the modules that you need:

Splitting the Dataset for Training and Testing
Once the images are loaded into the dataframe, you need to split the dataframe into a training and testing set:

x = df.iloc[:,:-1]
y = df.iloc[:,-1]
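In practice this split is typically done with scikit-learn's train_test_split. The sketch below shows the same idea with plain NumPy (shuffle the row indices, then slice off 80% for training), using a made-up feature matrix:

```python
import numpy as np

# made-up dataset: 10 samples, 3 features each, plus labels
X = np.arange(30).reshape(10, 3)
y = np.arange(10) % 2

rng = np.random.default_rng(seed=42)
indices = rng.permutation(len(X))   # shuffle the row order

split = int(0.8 * len(X))           # 80% train, 20% test
train_idx, test_idx = indices[:split], indices[split:]

x_train, x_test = X[train_idx], X[test_idx]
y_train, y_test = y[train_idx], y[test_idx]

print(x_train.shape, x_test.shape)  # (8, 3) (2, 3)
```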
Figure 17: Some of the positive results using the trained SVM model

# print the parameters for the best performing model
print(model.best_params_)

y_pred = model.predict(x_test)
print(f"The model is {accuracy_score(y_pred, y_test) * 100}% accurate")

Wei-Meng Lee
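The accuracy_score call above reduces to the fraction of predictions that match the test labels. A minimal NumPy equivalent, with made-up labels:

```python
import numpy as np

# made-up predictions vs. ground truth
y_pred = np.array([0, 1, 1, 0, 1])
y_test = np.array([0, 1, 0, 0, 1])

# accuracy = fraction of positions where prediction equals truth
accuracy = np.mean(y_pred == y_test)
print(f"The model is {accuracy * 100}% accurate")
```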
Prerequisites
If you want to work with the code examples discussed in this article, you need the following installed on your system:

• Microsoft.Extensions.Caching.Distributed.MemoryDistributedCache
• Microsoft.Extensions.Caching.Redis.RedisCache
• Microsoft.Extensions.Caching.SqlServer.SqlServerCache
• Microsoft.Extensions.Caching.StackExchangeRedis.RedisCache

A cache miss occurs when the data is not available in the cache. An application can leverage the benefits of caching if there are many more cache hits than cache misses.
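To make the hit/miss distinction concrete, here's a minimal, framework-neutral sketch (in Python, with a plain dict standing in for the distributed cache and a hypothetical load_from_database function) of the cache-aside pattern:

```python
cache = {}          # stands in for the distributed cache (e.g., Redis)
hits = misses = 0

def load_from_database(key):
    # hypothetical backing-store lookup
    return f"value-for-{key}"

def get(key):
    global hits, misses
    if key in cache:          # cache hit: data served from the cache
        hits += 1
        return cache[key]
    misses += 1               # cache miss: fall through to the database
    value = load_from_database(key)
    cache[key] = value        # populate the cache for the next caller
    return value

get("customer:42")   # miss: first access goes to the database
get("customer:42")   # hit: subsequent accesses are served from cache
print(hits, misses)  # 1 1
```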
The Redis service provided by Google Cloud Platform (GCP) is called Cloud Memorystore. Although you can export and import Redis RDB data between your servers and GCP, native backup options are not supported by Cloud Memorystore.

Set Up Redis Cache in Azure
Azure Cache for Redis is a secure in-memory cache for data storage and retrieval. It's fully managed, and you can use it to build high-performance applications that have scalable architectures. You can take advantage of Redis Cache in Azure, in cloud or hybrid deployments, to handle massive volumes of requests per second, as illustrated in Figure 1. In this section, I'll examine how to set up Redis Cache in Azure. Figure 2 shows a Web server retrieving data from the database and then pushing the data (usually relatively stale data is stored in the cache) to Redis Cache.

Figure 2: The Web Server pushes relatively stale data to Redis Cache

Follow the steps outlined below to create a new Redis Cache resource in Azure:

1. Sign in to the Azure portal. If you don't have an account, you can create one for free (the link is in the Prerequisites section).
2. Click Create a resource to create your Azure Redis resource.
3. Click on Databases and then select Azure Cache for Redis.

Figure 3 illustrates creating a new resource.

4. In the New Redis Cache page, specify the subscription plan, the resource group (you can select an existing one or select one from the dropdown list), the DNS name, your server location for using Redis, and the cache type.
Refer to Figure 4 to see the items from Step 4.

1. Next, choose the Network Connectivity to be used.
2. In the Advanced tab, select the Redis version to be used.
3. Finally, click on Review + Create to create the Redis resource in Azure.

Figure 5 illustrates specifying the configuration details.

Configure Redis Cache Connection String
Now that you've created your Azure Redis Cache resource, the next step is to configure it. You should configure the newly created resource to specify the connection string. This is needed by any application to connect to your Azure Redis Cache resource. Now follow the steps outlined below to connect to your Azure Redis Cache resource:

1. On the home page of the Azure portal, click on Resource groups.
2. Once the Resource groups page is displayed, select the resource group that is associated with the Azure Redis Cache resource you've just created.

Figure 6 illustrates the resource group for your Redis Cache resource.
1. Click on the Azure Cache for Redis instance.
2. Select Access keys under Settings and copy the primary or secondary connection string from there.

Figure 7 shows you how to specify access keys.

In the next section, I'll examine how to use this connection string to connect to your Azure Redis Cache instance from ASP.NET 6 applications.

Programming Redis Cache in ASP.NET Core 6
In this section, you'll implement a simple application that takes advantage of the Redis cache in Azure to cache relatively stale data. You'll be using ASP.NET 6 in the Visual Studio 2022 IDE.

Create a New ASP.NET 6 Project in Visual Studio 2022
Let's start building the producer application first. You can create a project in Visual Studio 2022 in several ways. When you launch Visual Studio 2022, you'll see the Start window. You can choose Continue without code to launch the main screen of the Visual Studio 2022 IDE.

To create a new ASP.NET 6 project in Visual Studio 2022:

1. Start the Visual Studio 2022 Preview IDE.
2. In the Create a new project window, select ASP.NET Core Web API, and click Next to move on.
3. Specify the project name as AzureRedisCacheDemo and the path where it should be created in the Configure your new project window.
4. If you want the solution file and project to be created in the same directory, you can optionally check the Place solution and project in the same directory checkbox. Click Next to move on.
5. In the next screen, specify the target framework and authentication type as well. Ensure that the Configure for HTTPS, Enable Docker Support, and Enable OpenAPI support checkboxes are unchecked because you won't use any of these in this example.
6. Click Create to complete the process.

You'll use this application in the subsequent sections of this article.

Install NuGet Package(s)
So far so good. The next step is to install the necessary NuGet package(s). To install the required packages into your project, right-click on the solution and then select Manage NuGet Packages for Solution.... Now search for the two packages named Microsoft.Extensions.Caching.StackExchangeRedis and StackExchange.Redis in the search box and install these packages one at a time. Alternatively, you can type the commands shown below at the NuGet Package Manager Command Prompt:

PM> Install-Package Microsoft.Extensions.Caching.StackExchangeRedis
PM> Install-Package StackExchange.Redis

Configure the Redis Cache Instance
You can use the following code snippet to specify the Redis connection string in the Program class:

services.AddStackExchangeRedisCache(option =>
{
    option.Configuration =
        Configuration.GetConnectionString
        ("Your_RedisCache_Connection_String");
    option.InstanceName = "master";
});

Note how the AddStackExchangeRedisCache service is registered and the Configuration property is assigned the Azure Redis connection string.
and when the data residing in the cache will expire. There are two ways in which you can implement cache expiration:

• Absolute Expiration: This denotes the maximum time period to store data in the cache. Once this time elapses, Redis deletes all keys and their corresponding data.
• Sliding Expiration: This denotes the maximum time period to store a piece of data when the application is not consuming the data.

You can write the following piece of code to implement cache expiration:

var expiration = new DistributedCacheEntryOptions {
    AbsoluteExpirationRelativeToNow =
        TimeSpan.FromSeconds(30),
    SlidingExpiration =
        TimeSpan.FromSeconds(25)
};

Clean Up
Now that you're done using the resources in Azure, it's high time that you delete them to avoid being billed. Follow the steps outlined below to delete the resources used in this example:

1. Sign in to the Azure portal.
2. Select Resource groups.
3. Enter the name of the resource group in the filter textbox.
4. When the resource group is listed in the results list, select it, and click "Delete resource group".
5. Once prompted for confirmation, enter the name of the resource group you'd like to delete.

Figure 8 illustrates how you can delete the resource group associated with your Redis Cache resource.

That's all you need to do! Your resource group will be deleted in a few minutes.

Where Should I Go from Here?
Now that you're aware of how to work with Redis Cache in Azure, you can take advantage of Application Insights in Azure to know the performance of your application over time. This will help you to analyze the performance improvement you'd gain by leveraging Redis Cache in Azure. You can also use a database in lieu of the in-memory data store you've used in this example to store data permanently.

Conclusion
Redis is a powerful distributed caching engine that provides key-value pair caching with very low latency. Redis may significantly improve application performance when used in the correct business context. Caching works best when the data changes infrequently, i.e., when the cached data doesn't change often. Remember, caching is a feature that helps speed up the performance, scalability, and responsiveness of your application, but your application should be properly tested so that it never depends on cached data.

Joydip Kanjilal
(Continued from 74)

who we lost a few years ago. We got into some mammoth debates over things that today seem trivial. But that spirit of community, like a family, with all its dysfunction, was a truly wonderful thing.

It was in that community where members like Steve Black turned us all on to how Design Patterns and the Gang of Four (GoF) book applied to our work. It was all new and exciting, especially when 1995 arrived, as well as Visual FoxPro and object orientation! How we fell in love with inheritance! How we hated all the problems that created! The bigger point was that we were all learning and sharing together. The FoxPro community was truly exceptional and no other community since then has ever come close to that experience, at least not for me. Speaking of Steve Black, he also introduced us to the Wiki concept that was first introduced by Ward Cunningham (Agile Manifesto, SOLID Programming). Check out fox.wikis.com. If there's one phrase that described the whole FoxPro experience, it's the notion of "applied theory." There's theory, and then there's the notion of applying theory in a way that makes it useful. Ever since my FoxPro days, that philosophy has permeated my thinking and writing.

Frameworks
We rely on all sorts of frameworks today: Angular, React, etc. And as previously mentioned, what we used to call public domain software is open source today. It was through FoxPro that I was introduced to the first serious way to structure applications. This tied together design patterns, libraries, and other approaches to building tools.

One of the big names in FoxPro history is a guy named Yair Alan Griver (YAG). Once upon a time, he had a little shop in River Edge, NJ called Flash Creative Management and he created a thing called the Codebook. It was a somewhat opinionated way of documenting an application and applying conventions. My framework of choice was something called FoxExpress by Mike and Toni Feltman (Fox Software alums!!). The point is that through the community, we were all in it together, learning and teaching each other.

Another great framework was WestWind Web Connection by Rick Strahl (co-founder of CODE Magazine). I remember, way back in the mid-1990s, Rick showing us how we could build Web applications with FoxPro. If you go to fox.wikis.com, take note of the wc.dll in the URL when you navigate to a page. WC stands for Web Connect. Yes, Rick still maintains that framework, along with producing what I still regard as the best scholarship and work in Web development today. Before there was ASP, Ruby on Rails, or Node.js, there was West Wind Web Connect. Today, the basic patterns we employ, such as the Model-View-Controller (MVC) Pattern and templating, are familiar because Rick was showing us how to do that 25 years ago in FoxPro.

Relationships
For me, the most significant personal relationship and most meaningful professional relationship was due to my relationship with FoxPro. My best friend is Rod Paddock, with whom I've worked on many projects, application-wise and book-wise, and also on this magazine for over 20 years. We met in 1994 in Toronto at FoxTeach. He wrote some cool content in the FoxTalk newsletter, the same publication where I got my start in professional writing. For the record, we didn't get paid hard money for those articles. Instead, the currency was coffee from this little Seattle-based coffee roaster called Starbucks. That's right… even coffee has a FoxPro connection for me! It was at that conference where Rod and I struck up our friendship and he told me about a book project he was on. That project became the book Visual FoxPro Enterprise Development. That project included me, Rod, Ron Talmage, and another guy named Eric Ranft. Eric went on to co-found a little e-signature company called DocuSign.

Although FoxPro as an active product is now history, its legacy is as relevant today as ever. I'm convinced that .NET, in no small part, owes its utility to that FoxPro acquisition. The people and ethos that were brought to bear on the Microsoft ecosystem have paid big dividends for the development world at large. And if you need any more reminding of that fact, you're reading this magazine, right?

John V. Petersen

May/Jun 2022  Volume 23 Issue 3

Group Publisher: Markus Egger
Associate Publisher: Rick Strahl
Editor-in-Chief: Rod Paddock
Managing Editor: Ellen Whitney
Contributing Editor: John V. Petersen
Content Editor: Melanie Spiller
Editorial Contributors: Otto Dobretsberger, Jim Duffy, Jeff Etter, Mike Yeager
Writers In This Issue: Bilal Haidar, Joydip Kanjilal, Wei-Meng Lee, Julie Lerman, Sahil Malik, Jeremy Miller, Paul D. Sheriff, Shawn Wildermuth
Technical Reviewers: Markus Egger, Rod Paddock
Production: Friedl Raffeiner Grafik Studio (www.frigraf.it)
Graphic Layout: Friedl Raffeiner Grafik Studio in collaboration with onsight (www.onsightdesign.info)
Printing: Fry Communications, Inc., 800 West Church Rd., Mechanicsburg, PA 17055
Advertising Sales: Tammy Ferguson, 832-717-4445 ext 26, tammy@codemag.com
Circulation & Distribution: General Circulation: EPS Software Corp.; Newsstand: American News Company (ANC)
Subscriptions: Subscription Manager: Colleen Cade, ccade@codemag.com

US subscriptions are US $29.99 for one year. Subscriptions outside the US are US $50.99. Payments should be made in US dollars drawn on a US bank. American Express, MasterCard, Visa, and Discover credit cards accepted. Bill me option is available only for US subscriptions. Back issues are available. For subscription information, e-mail subscriptions@codemag.com. Subscribe online at www.codemag.com.

CODE Developer Magazine
6605 Cypresswood Drive, Ste 425, Spring, Texas 77379
Phone: 832-717-4445
less remarkable than anything else. As an all-up environment that included an integrated data engine and SQL to complement its language, FoxPro was second to none. But there was something more to it.

In these pages, I often write about people, processes, and tools, in that order. For me, what was perhaps more important than the FoxPro tool was the FoxPro people. We often hear the word community in the context of social media. But once upon a time, before Facebook, blogs, etc., there was CompuServe. And in this context, there were the CompuServe FoxPro forums. It was in those forum spaces where I was exposed to and learned the importance of community. Those forums, among other things, were the Stack Overflow of their time. Whether it was a question about the product, how to optimize a query, or something more complex, such as application design, you'd surely get answers to your question. And, quite likely, you'd receive several answers, often in the context of spirited but friendly debate.

It was often good to reflect on the road traveled and that's what I'd like to do in this issue. Although FoxPro is no longer an active product, its spirit is alive and well, due in no small part to its legacy and its people. It's an anniversary deserving of celebration and reflection.

It's also worth noting that this year, .NET turns 20! Yes, there's a FoxPro connection there too. It was at the 1993 FoxPro Devcon in Orlando, Florida. The keynote that year was given by Roger Heinen, the Microsoft Developer Tools VP. In that talk, Roger spoke of the "Unified Language Strategy." That strategy eventually led to what, nine years later, became .NET.

It was at that Devcon that I remember seeing two whiz kids. One was Ken Levy and his cool tool

Pro developer, even just reading this magazine, you've been touched by the Fox! If you're super interested in a more detailed history, go to foxprohistory.org. Yes, there is such a site. I cite the incontrovertible maxim of Rule 34: The clean version is that there's a SIG (Special Interest Group) for everything <gd&r>.

Let's visit some things that owe their existence to or have been greatly impacted by FoxPro.

The MVP Program
Once upon a time, Microsoft's MVP program was known as "Calvin's list." Calvin Hsia was one of the lead Fox developers. He kept a list of CompuServe members who were most helpful on the FoxPro forums. A few folks that were helpful to me and many others were people like Pat Adams, Lisa Slater, and Tom Rettig, to name just a few. Although I'll touch upon community later, the magic that was lightning in a bottle were these "elder states people" who were always there to lend a helping hand. There was one basic rule: Pay it forward. That's certainly what I've endeavored to do, following in their footsteps. Eventually, the MVP program started around 1994-5 and it was around that time I joined those ranks. By then, other MS-related technologies were part of that MVP program. At that time, there may have been around 600 MVPs world-wide. And it had all started with Calvin's list.

Data
Remember ActiveX Data Objects? Predating ADO was ODBC (Open Database Connectivity). In the ODBC days, we had drivers. Eventually, it all led to the Entity Framework and other Object Relational Mapping (ORM) libraries. Going back to ADO, we had a similar concept called providers. The most compliant provider was the Jet Engine,

owned Fox Software, had the foresight to patent what is, simply stated, a very optimized approach to indexing data and this yielded fast query results. That was FoxPro's chief stock-in-trade. And that's why FoxPro veterans of that era are keenly adept at dealing with large amounts of data effectively and efficiently. It's that IP that MS was interested in and it's that IP that found its way into many of MS's future initiatives.

When it came to leveraging all Fox had with regard to data, I have to tip my hat to three former co-workers: George Goley, Melissa Dunn, and Dr. Michael Brachman. We all worked at Microendeavors (MEI), just outside Philadelphia. Yes, we were the best FoxPro shop in the world, hands down!! Another alum of that shop and somebody very famous in FoxPro history was the late Drew Speedie, with whom I had the pleasure of working and learning much from. And when it came time to work with SQL Server-based data in an approachable way, there was no better person to explain that than Robert Green. Robert was the FoxPro product manager for many years, and then eventually moved on to helping .NET become the fantastic framework it has been for 20 years.

Community
There are many communities today, thanks to social media platforms like Facebook and the ready availability of broadband. But once upon a time, when we were limited to 2400 or 4800 baud modem connectivity via a US Robotics modem, we had something called CompuServe. My CompuServe ID was 72722,1243. Why I remember that I don't know. Nevertheless, we had a great online community where we virtually hung out, debated, and most importantly, helped each other. Debates could be very spirited!

Eventually, CompuServe gave way to something
GenscreenX, a screen generator pre- and post- which was part of Access. ADO dealt with a client- known as the UniversalThread. The UT, as it was
processing tool. FoxPro was always “open” in its side notion of data known as a CURSOR (CURrent known, was another great place to discuss, de-
architecture. The screen and report generators Set Of Records). bate, and help each other. It was all organic,
were written in FoxPro! The extensions to those something that just existed. It wasn’t created or
facilities were referred to as public domain at One of Fox’s strengths was the notion of an in- conjured. It just happened. And ever since, there
the time. Today, we refer to such things as open tegrated database engine. The real magic was have been many attempts to recreate that magic.
source. when ANSI SQL was added to the Fox language, When I think of those days, a few names come
which itself was a variant of XBase, like Clipper to mind. I fondly remember the late John Koziol,
The other whiz kid is the publisher of this maga- and Dbase. FoxPro’s magic sauce was branded
zine, Markus Egger. Even if you weren’t a Fox- as Rushmore Technology. Dr. Dave Fulton, who (Continued on page 73)