I am honored, humbled, and elated to report that for the first time in my career I’ve been recognized with a Microsoft MVP Award!

According to the email I received, there are only a few thousand MVPs worldwide, so it is a pretty big deal for me to be part of this group. While I haven’t been told any specifics, my understanding is that the factors that qualify people for this award include activity within the technical community: contributions to open source projects, answering questions in technology forums, technical writing and blogging, and public speaking at developer conferences. I’ve been an active open source developer since 2004, when I founded the mojoPortal project, and over the years I’ve answered thousands of questions in the mojoPortal forums, but that project is built on older web technology and is not as popular these days as it once was. More recently I’ve founded a new set of projects, collectively branded as “cloudscribe” and built on the latest and greatest ASP.NET Core stack. I’ve spent the last year immersing myself in this new technology stack while it was in preview, and I’ve accumulated about 4500 reputation points on Stack Overflow in the past year, mostly answering questions related to ASP.NET Core. I’m really excited about ASP.NET Core because it makes it possible and natural to use design patterns that were difficult or impossible to use in the old WebForms framework, and it gives us a truly modern web framework that embraces the web. The old WebForms framework tried to make web development more like desktop application development, and it did so by hiding the web in such a way that one could build web applications and sites without really understanding the underlying web technology.
Looking back, I guess that was a good thing in the early days of the web, but after building web applications for many years you end up learning the web technology anyway, and you begin to realize that the old framework was making it harder for you to work directly with the nature of the web because of all the abstractions layered on top of it. By contrast, the new framework embraces the nature of the web and requires you to understand it and work with it directly, and ultimately that is a good thing.

One of the great things about doing open source development and sharing your work is that online communities are global and you make friends worldwide. I have been truly blessed to get to know some really nice people in far-flung corners of the world whom I would never have met otherwise. In fact, I would like to take this moment to offer sincere thanks and gratitude to my friend Guruprasad Balaji, a long-time mojoPortal community member from Chennai, India, who nominated me for this MVP Award. Who knows, maybe someday I will get an opportunity to travel to India and other places in the world and meet some of my online friends in person.

One of the benefits of the MVP Award that I’m most excited about is that once a year, in November, Microsoft holds an MVP Summit, and MVPs from all over the world fly in to Microsoft headquarters in Redmond, Washington! For a few days I will get to attend some insider technology sessions and meet and network with other MVPs! I will get to meet in person some really smart people I’ve admired online for many years!

Now that I’m a part of this MVP community, I intend to do my best to be included for many years to come. To earn my keep, I will of course keep doing open source development and helping people on Stack Overflow, but I will also try to be more active in public speaking. On October 25, 2016 I’m scheduled to give a presentation on the ASP.NET Core MVC framework at the Enterprise Developers Guild in Charlotte, NC. I’m very excited about this presentation, and I plan to use my latest open source projects to illustrate the concepts I will be presenting with real working code examples. I hope you can attend the event, but even if not, I hope you will take a look at my new open source projects related to ASP.NET Core:

  • cloudscribe Core – a multi-tenant web application foundation providing management of sites (tenants), users, roles, and role membership. Pretty much every web project needs that kind of thing, so this is a foundation to build other things on top of so you don’t have to re-implement that stuff for every new project.
  • cloudscribe Simple Content – a simple yet flexible blog and content engine that can work with or without a database (at the time of this writing it only works without a database, using my NoDb storage project, but I will soon implement Entity Framework data storage and possibly MongoDB at some point)
  • NoDb – a “no database” file system storage because many projects don’t need an enterprise database. Think of NoDb like a file system document database that stores objects in the file system serialized as json (or in some cases xml). Great for brochure web sites, and great for prototyping new applications.
  • cloudscribe Navigation – MVC components for navigation menus and breadcrumbs. I use this in cloudscribe Core and in cloudscribe SimpleContent, but it is useful for any MVC web project
  • cloudscribe Pagination – MVC TagHelpers for rendering pagination links for multi-page content
  • cloudscribe Syndication – a reusable RSS feed generator for ASP.NET Core. I use this for the RSS feed in cloudscribe Simple Content, but it could easily be used in other projects
  • cloudscribe MetaWeblog – a reusable implementation of the MetaWeblog API for ASP.NET Core. I use this in cloudscribe Simple Content to make it possible to author content using Open Live Writer
  • cloudscribe Logging - an implementation of ILogger and ILoggerProvider that logs to the database using a pluggable model supporting multiple data platforms. Also provides an MVC controller for viewing and managing the log data
  • cloudscribe SimpleAuth – simple no-database required user authentication for web applications where only a few people need to be able to login

I’m a little late in blogging about my MVP Award (I found out in early July), but I really wanted to be able to blog about it using my new Simple Content project. Previously this site was running on a really old version of mojoPortal, but I just rebuilt it using Simple Content for the blog. It took a while to get everything ready because this is the first real-world project I’ve done using Simple Content. I fixed a few bugs and implemented a few small improvements as part of completing this site, but there is still more work to do to complete all the features I’ve planned. This site uses NoDb for storage, which gives me an opportunity to prove the viability of building sites without a database.

I’ve never really considered myself a web designer; I am primarily a web developer, but using Bootstrap makes it easy to put together a professional-looking, mobile-responsive site. I’m no visual artist, but I’m a decent mechanic when it comes to CSS. This is a quick first-draft design, and I may yet change the color scheme and get a little more creative with it, but I think it is a major improvement over the outdated design of my old web site.

When building cross-platform web applications on ASP.NET Core, one might think that since the framework itself is cross-platform, your application will just work on any supported platform without having to do anything special. That is mostly true, but there are still some things to consider during development to avoid problems caused by platform differences. For example, file systems on Linux are case sensitive, and therefore so are URLs, so you generally want to standardize on lower-case URLs. Also, when building strings with file system paths, you should always use Path.Combine or concatenate with Path.DirectorySeparatorChar rather than a backslash, since the backslash is Windows specific and the forward slash is used on Linux and macOS. Additionally, time zone ids are different on Windows than on Linux and macOS, which use IANA time zone ids, and if you pass the TimeZoneInfo class a time zone id that is invalid for the current platform, it throws an exception. If you are storing time zone ids for users or sites in a database and you migrate the site to a different platform, all of the stored time zone ids could be invalid on the new platform. I ran into this problem while developing my cloudscribe project, because it supports the concept of a site-level time zone that is used by default, while an authenticated user may choose their own time zone. The approach I use is to always store dates in the database as UTC, then adjust them to the given time zone for display as needed.
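As a quick sketch of the path issue, here is a small console example; the folder and file names are just made up for illustration:

```csharp
using System;
using System.IO;

public class PathExample
{
    public static void Main()
    {
        // Don't do this: a hard-coded backslash only works on Windows
        // var logPath = "data" + "\\logs\\app.log";

        // Do this: Path.Combine inserts the correct separator for the current platform
        var logPath = Path.Combine("data", "logs", "app.log");

        // prints data/logs/app.log on Linux/macOS, data\logs\app.log on Windows
        Console.WriteLine(logPath);
    }
}
```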

To solve the problem, I turned to the NodaTime project, which provides a comprehensive date/time library that is arguably superior to the DateTime and TimeZoneInfo classes provided in the .NET Framework. For applications and features that are focused on dates, such as calendaring and scheduling, I think I would go all in on the NodaTime classes instead of the built-in framework classes, but for more common scenarios it makes sense to me to stick with the standard DateTime class for entity properties, since that will typically map directly to a corresponding database datetime on various database platforms without any friction. So for my purposes, what I wanted was to use NodaTime as a means to standardize on IANA time zone ids, and as a tool to convert back and forth between UTC and various time zones as needed, in a way that works the same on any platform. NodaTime has a built-in list of the IANA time zones, and it has the functionality needed for the conversions to and from UTC. While I am continuing to use the standard DateTime class for all my date properties on entities, I am no longer using the standard TimeZoneInfo class built into the .NET Framework for converting dates back and forth; instead I use a little TimeZoneHelper class that I implemented to encapsulate the conversions, so that none of my other code needs to know about NodaTime. My TimeZoneHelper also exposes the list of time zone ids provided by NodaTime, so I can use it to populate a dropdown list for time zone selection, and I simply store the IANA time zone id in the database for my sites and users.

This TimeZoneHelper class is not very large or complex and I think it would be useful for anyone else who wants a consistent way to handle TimeZones in a platform neutral way. Feel free to use this code in your own projects!

using Microsoft.Extensions.Logging;
using NodaTime;
using NodaTime.TimeZones;
using System;
using System.Collections.Generic;

namespace cloudscribe.Web.Common
{
    public class TimeZoneHelper : ITimeZoneHelper
    {
        public TimeZoneHelper(
            IDateTimeZoneProvider timeZoneProvider,
            ILogger<TimeZoneHelper> logger = null
            )
        {
            tzSource = timeZoneProvider;
            log = logger;
        }

        private readonly IDateTimeZoneProvider tzSource;
        private readonly ILogger log;

        public DateTime ConvertToLocalTime(DateTime utcDateTime, string timeZoneId)
        {
            DateTime dUtc;
            switch(utcDateTime.Kind)
            {
                case DateTimeKind.Utc:
                    dUtc = utcDateTime;
                    break;
                case DateTimeKind.Local:
                    dUtc = utcDateTime.ToUniversalTime();
                    break;
                default: //DateTimeKind.Unspecified
                    dUtc = DateTime.SpecifyKind(utcDateTime, DateTimeKind.Utc);
                    break;
            }

            var timeZone = tzSource.GetZoneOrNull(timeZoneId);
            if (timeZone == null)
            {
                if(log != null)
                {
                    log.LogWarning("failed to find timezone for " + timeZoneId);
                }
               
                return utcDateTime;
            }

            var instant = Instant.FromDateTimeUtc(dUtc);
            var zoned = new ZonedDateTime(instant, timeZone);
            return new DateTime(
                zoned.Year,
                zoned.Month,
                zoned.Day,
                zoned.Hour,
                zoned.Minute,
                zoned.Second,
                zoned.Millisecond,
                DateTimeKind.Unspecified);
        }

        public DateTime ConvertToUtc(
            DateTime localDateTime,
            string timeZoneId,
            ZoneLocalMappingResolver resolver = null
            )
        {
            if (localDateTime.Kind == DateTimeKind.Utc) return localDateTime;

            if (resolver == null) resolver = Resolvers.LenientResolver;
            var timeZone = tzSource.GetZoneOrNull(timeZoneId);
            if (timeZone == null)
            {
                if (log != null)
                {
                    log.LogWarning("failed to find timezone for " + timeZoneId);
                }
                return localDateTime;
            }

            var local = LocalDateTime.FromDateTime(localDateTime);
            var zoned = timeZone.ResolveLocal(local, resolver);
            return zoned.ToDateTimeUtc();
        }

        public IReadOnlyCollection<string> GetTimeZoneList()
        {
            return tzSource.Ids;
        }
    }
}

You'll notice that this helper has some constructor dependencies; you can wire those up to be injected in the ConfigureServices method of your Startup class like this:

services.TryAddSingleton<IDateTimeZoneProvider>(new DateTimeZoneCache(TzdbDateTimeZoneSource.Default));
services.TryAddScoped<ITimeZoneHelper, TimeZoneHelper>();

Note that I made an interface, ITimeZoneHelper, that this class implements, which would allow me to plug in different logic if I ever need or want to, but you could simply remove the interface declaration if you use this in your own project. I hope you find this code useful!
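To show how this fits together, here is a minimal usage sketch. It constructs the helper directly rather than getting it through dependency injection, and the time zone id and dates are just illustrative:

```csharp
using System;
using NodaTime;
using NodaTime.TimeZones;
using cloudscribe.Web.Common; // the namespace of the TimeZoneHelper class above

public class TimeZoneHelperDemo
{
    public static void Main()
    {
        // the same provider that would normally be registered for injection
        var provider = new DateTimeZoneCache(TzdbDateTimeZoneSource.Default);
        var helper = new TimeZoneHelper(provider);

        // a UTC datetime, as it would come from the database
        var utc = new DateTime(2016, 7, 1, 12, 0, 0, DateTimeKind.Utc);

        // convert for display using an IANA time zone id
        var eastern = helper.ConvertToLocalTime(utc, "America/New_York");
        Console.WriteLine(eastern); // 8:00 AM, since Eastern time is UTC-4 in July

        // and convert user input back to UTC before storing
        var backToUtc = helper.ConvertToUtc(eastern, "America/New_York");
        Console.WriteLine(backToUtc == utc); // True
    }
}
```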

If you are porting web applications to ASP.NET Core or building new web applications, you may notice that the System.Net.Mail namespace is not implemented in .NET Core. My understanding is that there are intentions to implement that namespace later, but it may be a little stumbling block for early adopters, since sending email is a very fundamental thing that most web applications need to do for common tasks such as verifying an email address for a new account or facilitating password reset.

There are some examples out there for sending email with various third-party services such as SendGrid, MailGun, Elastic Email, and the like, using REST APIs to send the mail instead of SMTP, and that is certainly a good option to consider. But for those who already have an SMTP server they want to use, a better solution is needed. The good news is a better solution already exists: it works currently with RC1 of ASP.NET Core and will surely also be available for RC2 and later releases.

There are actually two related projects you should know about: MailKit and MimeKit. The goal of the MailKit project is “to provide the .NET world with robust, fully featured and RFC-compliant SMTP, POP3, and IMAP client implementations”, and indeed it meets that goal; in many ways it is more powerful and flexible than the traditional System.Net.Mail components.

Here I will show some working example code from my cloudscribe Core project. More specifically, this example code is from my EmailSender class, which is a work in progress but functional enough to be a good example.

Let’s start with a simple class to represent and encapsulate the settings we need to connect and authenticate with an SMTP server:

    public class SmtpOptions
    {
        public string Server { get; set; } = string.Empty;
        public int Port { get; set; } = 25;
        public string User { get; set; } = string.Empty;
        public string Password { get; set; } = string.Empty;
        public bool UseSsl { get; set; } = false;
        public bool RequiresAuthentication { get; set; } = false;
        public string PreferredEncoding { get; set; } = string.Empty;
    }
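If you prefer to keep these settings in appsettings.json, they can be bound with the ASP.NET Core options pattern. This is just a sketch: the "SmtpOptions" section name and the NotificationService class below are assumptions about how you might structure things, not part of my actual project:

```csharp
// in Startup.ConfigureServices, with an "SmtpOptions" section in
// appsettings.json whose keys match the property names above:
services.Configure<SmtpOptions>(Configuration.GetSection("SmtpOptions"));

// then any class can take the settings as a constructor dependency:
public class NotificationService
{
    private readonly SmtpOptions smtpOptions;

    public NotificationService(IOptions<SmtpOptions> optionsAccessor)
    {
        smtpOptions = optionsAccessor.Value;
    }
}
```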

You would new up one of these and set the properties according to your SMTP server configuration, then pass it in as one of the parameters to an EmailSender class that looks something like this:

using MailKit.Net.Smtp;
using MimeKit;
using System;
using System.Threading.Tasks;

namespace cloudscribe.Messaging.Email
{

    public class EmailSender
    {
        public EmailSender()
        {
        }

        public async Task SendEmailAsync(
            SmtpOptions smtpOptions,
            string to,
            string from,
            string subject,
            string plainTextMessage,
            string htmlMessage,
            string replyTo = null)
        {
            if (string.IsNullOrWhiteSpace(to))
            {
                throw new ArgumentException("no to address provided");
            }

            if (string.IsNullOrWhiteSpace(from))
            {
                throw new ArgumentException("no from address provided");
            }

            if (string.IsNullOrWhiteSpace(subject))
            {
                throw new ArgumentException("no subject provided");
            }

            var hasPlainText = !string.IsNullOrWhiteSpace(plainTextMessage);
            var hasHtml = !string.IsNullOrWhiteSpace(htmlMessage);
            if (!hasPlainText && !hasHtml)
            {
                throw new ArgumentException("no message provided");
            }

            var m = new MimeMessage();
          
            m.From.Add(new MailboxAddress("", from));
            if(!string.IsNullOrWhiteSpace(replyTo))
            {
                m.ReplyTo.Add(new MailboxAddress("", replyTo));
            }
            m.To.Add(new MailboxAddress("", to));
            m.Subject = subject;

            //m.Importance = MessageImportance.Normal;
            //Header h = new Header(HeaderId.Precedence, "Bulk");
            //m.Headers.Add()

            BodyBuilder bodyBuilder = new BodyBuilder();
            if(hasPlainText)
            {
                bodyBuilder.TextBody = plainTextMessage;
            }

            if (hasHtml)
            {
                bodyBuilder.HtmlBody = htmlMessage;
            }

            m.Body = bodyBuilder.ToMessageBody();
           
            using (var client = new SmtpClient())
            {
                await client.ConnectAsync(
                    smtpOptions.Server,
                    smtpOptions.Port,
                    smtpOptions.UseSsl)
                    .ConfigureAwait(false);
               
                // Note: since we don't have an OAuth2 token, disable
                // the XOAUTH2 authentication mechanism.
                client.AuthenticationMechanisms.Remove("XOAUTH2");

                // Note: only needed if the SMTP server requires authentication
                if(smtpOptions.RequiresAuthentication)
                {
                    await client.AuthenticateAsync(smtpOptions.User, smtpOptions.Password)
                        .ConfigureAwait(false);
                }
               
                await client.SendAsync(m).ConfigureAwait(false);
                await client.DisconnectAsync(true).ConfigureAwait(false);
            }

        }

        public async Task SendMultipleEmailAsync(
            SmtpOptions smtpOptions,
            string toCsv,
            string from,
            string subject,
            string plainTextMessage,
            string htmlMessage)
        {
            if (string.IsNullOrWhiteSpace(toCsv))
            {
                throw new ArgumentException("no to addresses provided");
            }

            if (string.IsNullOrWhiteSpace(from))
            {
                throw new ArgumentException("no from address provided");
            }

            if (string.IsNullOrWhiteSpace(subject))
            {
                throw new ArgumentException("no subject provided");
            }

            var hasPlainText = !string.IsNullOrWhiteSpace(plainTextMessage);
            var hasHtml = !string.IsNullOrWhiteSpace(htmlMessage);
            if (!hasPlainText && !hasHtml)
            {
                throw new ArgumentException("no message provided");
            }

            var m = new MimeMessage();
            m.From.Add(new MailboxAddress("", from));
            string[] adrs = toCsv.Split(',');

            foreach (string item in adrs)
            {
                // skip empty entries and trim whitespace around each address
                if (!string.IsNullOrWhiteSpace(item)) { m.To.Add(new MailboxAddress("", item.Trim())); }
            }

            m.Subject = subject;
            m.Importance = MessageImportance.High;
          
            BodyBuilder bodyBuilder = new BodyBuilder();
            if (hasPlainText)
            {
                bodyBuilder.TextBody = plainTextMessage;
            }

            if (hasHtml)
            {
                bodyBuilder.HtmlBody = htmlMessage;
            }

            m.Body = bodyBuilder.ToMessageBody();

            using (var client = new SmtpClient())
            {
                await client.ConnectAsync(
                    smtpOptions.Server,
                    smtpOptions.Port,
                    smtpOptions.UseSsl).ConfigureAwait(false);
               
                // Note: since we don't have an OAuth2 token, disable
                // the XOAUTH2 authentication mechanism.
                client.AuthenticationMechanisms.Remove("XOAUTH2");

                // Note: only needed if the SMTP server requires authentication
                if (smtpOptions.RequiresAuthentication)
                {
                    await client.AuthenticateAsync(
                        smtpOptions.User,
                        smtpOptions.Password).ConfigureAwait(false);
                }

                await client.SendAsync(m).ConfigureAwait(false);
                await client.DisconnectAsync(true).ConfigureAwait(false);
            }

        }

    }
}

Note that I’ve implemented two methods here: one that sends email to a single recipient, and one that takes a comma-separated list of recipients. This code could probably be refactored a bit to reduce duplication, and I plan to implement more overloads for handling things like attachments. This is just an initial working stub that I plan to evolve as I encounter more varied needs in my project. Note that you can pass in an HTML-formatted message, a plain text message, or both, but you must of course pass in at least one of them. I’ve left a few comments in the code to show how things like message importance can be set, but I’ve really only scratched the surface of what MailKit/MimeKit can do for you, so I encourage you to explore the API surface of those projects.
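To round this out, here is a sketch of calling the sender from a console program. The server name, credentials, and addresses are all placeholders you would replace with your own:

```csharp
using cloudscribe.Messaging.Email; // the namespace of the classes above

public class EmailDemo
{
    public static void Main()
    {
        var smtpOptions = new SmtpOptions
        {
            Server = "smtp.example.com",   // placeholder server
            Port = 587,
            UseSsl = true,
            RequiresAuthentication = true,
            User = "user@example.com",     // placeholder credentials
            Password = "your-password"
        };

        var sender = new EmailSender();

        // block on the async call only because this is a console sketch;
        // in an ASP.NET Core controller you would await it instead
        sender.SendEmailAsync(
            smtpOptions,
            to: "recipient@example.com",
            from: "sender@example.com",
            subject: "Hello from MailKit",
            plainTextMessage: "This is the plain text body.",
            htmlMessage: "<p>This is the <strong>HTML</strong> body.</p>")
            .GetAwaiter().GetResult();
    }
}
```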

Feel free to borrow this code and use it in your own projects, and I hope you will also take a look at my various open source projects on github that may be of use or value to you on your projects.

Happy Coding!!!

Dear Harris Teeter,

I really did not want to have to write this post, but I told you I would and I’m a man of my word.

First let me say that I like Harris Teeter in general: you have good produce and good meats, your staff is generally very nice and helpful, and your prices are reasonable. You were born here in North Carolina, and I like to support businesses from my home state. If it were not for your good qualities I would not bother with this post; I would just stop shopping there. But I like Harris Teeter enough to try to make it better.

Therefore I’m going to try to get you to change the one really bad thing you do that makes me very uncomfortable when shopping there (and I assume it makes some other people feel uncomfortable too). If I cannot get you to change after trying my best, then I will have no choice, as a man of principle, but to stop shopping there.

So What’s My Beef?

I’m really sick of hearing “security scan cameras”, “security scan and record all cameras”, “security scan cameras E and F”, and all the other variants that I’ve been hearing for many years while shopping at your store.

I have complained in the past at my local store and I have emailed your headquarters and tweeted at you about this in hopes that you would do the right thing. I told you that 140 characters on twitter is not enough to express why this practice is wrong and must be stopped and that I would write a blog post to shame you into changing if needed. I waited, but you have not done the right thing, so here we are.

The thing is, this voice that announces variations of “security scan cameras” is not a robot voice, it is one of the staff. When I first heard it I thought hmm, must be someone suspicious in the store. But for someone who shops frequently, when you tend to hear that about 80-90% of the times you shop there, when you hear it soon after entering the store, you start thinking maybe it is me they think is suspicious. And that makes me feel uncomfortable. And the more I think about it the more it bothers me. Even if it isn’t me, who is it and what makes them suspicious? The way they are dressed, the color of their skin, etc. Being a person who has empathy for my fellow humans this bothers me and I cannot support a business that does this.

Now when I have complained I have been told those announcements are random. But you know what, having a policy to randomly do that also gives cover to do it for other reasons so while I would like to believe you are not profiling people, I really cannot rule that out, and that bothers me.  If it really is random, does it make sense to randomly make customers feel like suspects? Business marketing 101 says no!

It does not make sense to make customers feel uncomfortable while shopping, and it does not make sense for me to keep shopping at a place that persists in a practice that makes me uncomfortable.

So I started thinking about why you have this policy. You must think it is beneficial; you have made it clear it is part of a “loss prevention” strategy. I think the reason this has not backfired on you until now is that the average clean-cut white person who hears that is not going to think they are the suspicious one in the store. Some people might say you are using “white privilege” as part of the psychology of your loss prevention scheme. Here in the South you might have gotten away with that in years past, but the time has come for that to stop. For the record, I’m a white person, but I am not immune to negative stereotype profiling because I have long hair. Some conservative folks frown on that sort of thing and make negative assumptions about you based on it. But there are many superficial things that people get profiled on: skin color, tattoos, piercings, and so on, so even white people may not feel the white privilege in every case. I applaud the unique individuals who have the courage to be who they are and express themselves through personal style in spite of knowing how some people will judge them. I have long felt the societal pressure to conform, and even today, if I were not self-employed, I think I would have a hard time getting some jobs for which I am very qualified without cutting my hair. I “could” cut my hair, and perhaps then I could go back to assuming those announcements are not targeted at me, but that would be giving up a little piece of my dignity. For people of color, my empathy says they probably also feel uncomfortable when they hear those announcements in your store.

I don’t want to stop shopping there without first doing my best to make you change this practice. So I plan to tweet a link to this post and my dismay every time I hear that when I shop at your store. If after 12 months of doing that you won’t change then I will stop shopping there and I will also encourage everyone I know to boycott your store.

The bottom line is I will not tolerate this treatment towards me, and I will not tolerate this treatment towards others indefinitely. I implore you to do the right thing and change this policy.

What Do I Suggest?

Look, I can hear the devil’s advocate arguments in my head saying “but loss prevention is important, if our losses go up we have to charge you more and you don’t want that right?” My answer to that is, I don’t hear this in other stores, and I would rather pay for my groceries with money than pay with my dignity.

What I suggest is make a recording and play it as often as you wish, but the recording should not imply that there is someone suspicious in the store. A good message alternative that I can think of is:

“For your safety this store is monitored by full coverage video surveillance.” That still sends a message to would-be shoplifters to make them think twice, and it puts a positive spin on the surveillance policy. People expect to be under surveillance while shopping, so this kind of message is acceptable. But a message that makes customers feel like suspects is simply not acceptable, and it must stop for the long-term benefit of the Harris Teeter brand. Your new parent company Kroger does not make customers feel like suspects, so maybe you need your parent to have a talk with you.

I invite anyone who reads this and agrees with me to join me in the effort to make Harris Teeter do the right thing. If you shop there, pay attention, and if you hear similar messages and agree they should stop, then tweet a link to this post and express your dismay to @HarrisTeeter on Twitter. And if you are a human with empathy towards other humans, I think you should care about this. The world is full of bigger problems, but this is something where a small positive change is needed, and it would take very little effort to help make that small change happen. Please also keep an eye on my tweets and retweet me when I tweet my dismay towards @HarrisTeeter; you can follow my tweets at @joeaudette.

Companies use twitter as an important tool in their marketing and they do not like to see a lot of negative sentiment tweets about their brand. If we join together using the power of social media we can influence Harris Teeter to do the right thing. I’m going to try that for up to a year from now. If they tell me they have changed the policy and if I stop hearing the dreaded announcements then I will update this post or even take it down. I’m trying to create positive change, I did not want to make a stink about this, but it seems that is what it will take.

UPDATE 2016-05-10
Today was my first time shopping at Harris Teeter since making this post. Less than 30 seconds after entering the store I heard a man's voice come over the speaker and say "security scan and record camera A". I'm pretty sure that is their way of flipping me the bird and telling me they are dug in on this policy and don't want me shopping there. So I'm about to tweet "another #badexperience shopping at #harristeeter today" with a link to this post and cc @HarrisTeeter. I guess I will also make another public post about it on Facebook. So it seems to me they are not so worried about loss prevention, because they don't care if they lose customers who are bothered by this policy. Nevertheless, I intend to keep shopping there, keep tweeting and posting whenever this happens, and try to spread more awareness about this bad practice. They seem pretty dug in though, so I'm not optimistic that there is any humanistic leadership at that organization, and in time I may have to go full boycott mode.

UPDATE 2016-05-10 - evening

After reflecting on today's experience I've decided life is too short, and I'm just not going back to Harris Teeter, it is not worth the effort and they clearly don't want me there. If you decide yourself to stop shopping there for the same reason as me, please at least send them a parting tweet linking to this post and telling them they lost another customer.

I've been a mediocre guitar player for many years, good enough to accompany myself singing, and I've done fairly well playing solo gigs and in bands in years past, but I always felt that I had not learned music all the way and had not developed my musicianship to its full potential. So about two years ago I decided I wanted to go back and really learn music on piano/keyboard: to study music theory, learn to read and write music, and develop my ear to the point where I could transcribe music I hear and write down musical ideas that I hear in my mind. I still have a ways to go on those goals, but I feel I'm making steady progress on my piano/keyboard playing, and one of my sub-goals now is to learn to play from Fake Books, aka Real Books, the way jazz musicians usually do. A lead sheet from a Fake Book/Real Book typically notates the melody line and the chord symbols; the idea is to improvise around the melody and devise your own arrangement for voicing the chords to harmonize it.

Aside: Historically, "Fake Books" were typically under-the-counter books that jazz musicians would buy from music stores or assemble for themselves in loose-leaf binders and share with others. This practice often ran afoul of copyright law, which is why the books were sold under the counter rather than over it. The music publishing industry has tried to remedy this by publishing "Real Books" that carry licenses for the copyrighted works, so that the authors and composers receive their royalties from sales of the books.

Ok, so I got myself a Real Book of jazz standards to learn from, but I quickly realized that only a few of the songs were familiar to me, which got me wondering how these songs became jazz standards and how I could get more familiar with them. So I found this book, The Jazz Standards: A Guide to the Repertoire, which gives a bit of the history of the songs the author considers standards and describes the various recordings and interpretations by different jazz artists. Some songs have come in and out of favor over the years, so there are many opinions about which tunes should be considered standards, and there are six volumes of Real Books of jazz standards reflecting some of those changes in the common repertoire over time. Still, the book is a good starting point for learning about the jazz standards, though it is not intended to be all-inclusive. I got the Kindle version, but in hindsight I would rather have the printed copy: while it is a book you could read front to back, it is also the kind of book you can skip around in and browse, and the printed version would be better for that, I think.

Now that is all well and good and stimulating to the curiosity, and it makes me want to listen to the recordings mentioned in the book to hear the different ways these tunes have been interpreted over time, but it would cost a whole lot of money to go on a buying spree to obtain all of those recordings. Fortunately for those seeking a jazz education, in the age of Spotify one can listen to those songs at any time, for free with advertisements or ad-free for a reasonable subscription price. I know, I know, streaming is killing the music business, so they say, and records killed live music, and video killed the radio star and all that yadda yadda. Nevertheless, there is something to be said for being able to listen to all that music that would otherwise be out of reach of a poor aspiring musician who wants to educate him or herself through exposure to this large repertoire.

Anyway, I worked my way through the book, building a Spotify playlist of all the recordings mentioned in it that I could find. I was able to find the majority of them, and it took quite a while to create the playlist, so I thought I should share it with others who, like me, may want to learn the jazz standards repertoire.


This playlist can be useful for listening to multiple renditions of each song in a row to hear how different the interpretations are, or you can set playback to shuffle and just enjoy a random stream of great jazz music.