The announcement of Power BI Embedded came last week during Microsoft’s Build Conference, and it now enables software vendors, or anyone building an application, to embed the power of Power BI into their own application.  I have been hearing about this feature for almost a year now, since the HIMSS 2016 conference where I had the privilege of running the Power BI demos at the Microsoft booth.  While I wish this feature hadn’t taken a year to come to fruition, it is something to be celebrated now.

As the technology provider at I.Predictus, we chose Power BI as our data visualization platform about 8 months ago, and one frustrating point was not being able to give our clients a seamless interaction between our platform and our data visualizations.  They would click to view the visualizations and be brought to a separate login and a completely different-looking platform.  While the visualizations themselves were insightful and received constant praise, we always heard the comment that clients wished they didn’t have to log in twice, and as a technology owner myself it left a bad taste in my mouth.  I even did a POC using the Power BI APIs to render reports; however, it didn’t give me the full experience that I was looking for.

If you have not had a chance to review the Power BI Embedded architecture, it is rather simple: it is fully hosted in Azure via ARM, and clients no longer need a Power BI login.  The pricing changes from a flat monthly fee to a per-1,000-renders pricing model, which depending on your usage may or may not end up costing a little bit more, but it is worth the cost of essentially white labeling Power BI.  One additional note: this is currently in preview, and as of now the data sources are somewhat limited; DirectQuery only works for SQL Azure, SQL Azure DW and HDInsight Spark, and cached datasets cannot be refreshed.  That shouldn’t stop you, though, from testing out this new feature and looking to integrate it into your products.

Next week I will post a step-by-step blog on how to integrate an existing web application with Power BI Embedded.

Announced on 3/31/2016, SQL Server Developer Edition is now free and available as a download to all who have signed up for Visual Studio Dev Essentials.  Even if you just call yourself a database guy/gal, I highly recommend signing up here.  In the past the only way to get Developer Edition was through a paid MSDN subscription, and the licensing headache in a large development environment was a nightmare.  I’ve personally taken over 20 hours of Microsoft licensing training and I still couldn’t fully explain it to you.

This announcement significantly clears up the licensing model for developers.  Remember, you can’t use Developer Edition for production environments or production data, but for all your development and test environments, just sign up for Visual Studio Dev Essentials and you’re covered.  Licensing isn’t the only advantage here, because Developer Edition has all the features of Enterprise Edition, so if there are features you want to explore that aren’t currently in use, you now have access to them.  I know I have been in the situation many times where someone asked, “Why should we upgrade to Enterprise?”  Now you can build out a POC using those features in a dev environment and pitch the advantages to management.

With all these tools now free and available, this should also expand the SQL community, create more SQL professionals in the long run, and lead to more qualified candidates for open positions.  This is something that all existing SQL professionals should be championing and sharing however they can.

One of the more underused features of Power BI is the Google Analytics data source.  Working closely with marketers and agencies, I continue to see them struggle with getting and reporting on their data from Google Analytics, and it should not be that difficult.  With Power BI, or even with Power Query in Excel, you can easily get, model and visualize this data with only a few clicks.

Let’s walk through how easy this is.  The first step is to get the data from Google Analytics, which is done by selecting Get Data, choosing Google Analytics and logging in with your account.


Once logged in you will see a list of sites that are being managed under your account, and from here you drill down into your site and select the data you need.  For this example, let’s say I want to look at the number of hits on my site and be able to separate new users from existing users.  For this I will need to get Hits from Session, User Type from User and Date under Time.


Now that we have our data elements, click Load and the data will be loaded into Power BI.  To add a little depth to my report I did two things.  First I imported a date table, and if you don’t already have one, here is a great blog on creating one in Power BI.  Next I created a custom column in my date table to split records into three categories: “Last 30”, “Greater than 30” and “Future”.

DAX: Last 30 Days = IF(NOW() >= [Date], IF(DATEDIFF([Date], NOW(), DAY) <= 30, "Last 30", "Greater than 30"), "Future")

Last, I created a measure to show a goal for hits, using hits from the previous year with a 20% increase.

DAX: Hits Goal = CALCULATE(SUMX('I Predictus (2)', [Hits]), SAMEPERIODLASTYEAR('G_Calendar'[Date]), ALL(G_Calendar)) * 1.2

Now I can create a couple of simple visuals: one showing overall hits in the last 30 days against goal, and the other showing daily hits by user type against the overall goal.


This is just a small example of the data available via the Google Analytics data source.  If you are using Google Analytics, then I suggest you start using this data source and start developing your reports and dashboards.  Oh, and how could I forget to mention that once you have these developed, you can schedule the refreshes to run automatically.

If you missed it last week, Microsoft released a Service Update for Power BI, and a long-awaited feature is now here.  You can now export the data behind a visual, and this closes a huge gap for Power BI when compared to many of its competitors.  For me this is beyond huge, as all I have been hearing from clients for the past 5 months is, “When are exports going to be available?”  The answer is finally here, and it is very simple to use.

For any visual, all you have to do is select the menu in the top right and choose Export data.

[Screenshot: a visual’s menu showing the Export data option]

It is that easy, and you will get a download of the data in Excel for that visual.  One note is that it does not drill down into the data; the export is at the same level of aggregation as the visualization.  For example, for the visual above, the following are the results.

[Screenshot: the exported data for the visual above]

However, if you choose a visual with more detailed data, such as a table, below is the visual and result pair.

[Screenshot: a table visual and its exported data]

Or in a column chart:

[Screenshot: a column chart and its exported data]

Happy exporting everyone!

While I never really enjoy a cross-country flight back home, my feet are very happy that I am finally sitting down for a few hours.  CES 2016 was once again a great year and proved to me again why it’s a must-attend conference that spans many technologies and reaches almost every industry.  This was my 5th time attending, and over those years I have represented technology in several different industries, but this was my first time focusing on marketing.  In most of my previous years I was looking at what technology could be built on top of, or integrated into, what I was already doing.  This year, however, the focus was more on different ways to interact with the consumer, along with understanding consumers’ buying patterns across all industries.

There certainly was one very common theme that you saw at every turn, “Mobile is king”, and it’s not just the phone that we all can’t live without anymore.  It was also the theme across the auto industry’s connected cars, from Wi-Fi-enabled cars and integrated entertainment with Amazon to self-driving vehicles.  If the car can drive itself, that gives you more time to spend on your mobile devices or enjoying the entertainment systems.  From a content delivery aspect, imagine not having to rely on radio and the spoken word to get the attention of those who are driving, and instead delivering targeted advertising based on the location of the vehicle.  In my opinion we are not far away from this becoming a reality, perhaps within the next couple of years.  One additional thought here as well: imagine the car becoming even more integrated so that, say, when the gas light comes on, it uses GPS and Wi-Fi to automatically search for the nearest gas station and give you directions.

From a phone aspect there was every gadget you could think of: phones with longer battery life and larger screens, plus devices to give you additional battery life, protect your phone, boost your signal and protect your personal information.  All of these tell me one thing: the mobile phone has become a device that most people no longer know how to live without, myself included.  In almost every keynote, session, panel discussion or fireside chat I attended, they all talked about the importance of, and new ways of, interacting with the consumer, who spends over 14 hours a day with the phone no further than an arm’s length away and averages around 4 hours actively looking at the screen.  There is no doubt that the shift of marketing dollars and content will only increase in the digital space as mobile continues to gain not only additional eyeballs but also a greater share of recreational time compared to TV.

From a traditional TV marketing standpoint, everyone was in agreement that TV viewership is down and that this is a trend that most likely is not going to change.  However, DISH Network is certainly trying to make sure that not everyone walks away from their TV or cuts the cord.  Their new DSTB is capable of 4K resolution and, more importantly, it is integrated with Sling, YouTube and Netflix.  This points to a new trend of, if you can’t beat them, join them.  Netflix has certainly disrupted the likes of both cable and satellite providers, so why not build in Netflix search and try to keep subscribers from cutting the cord completely.  It’s also a very smart move, as it gathers a greater amount of information about the consumer and their watching habits and increases the ability to provide addressable media.

Overall, the media landscape and consumer interactions continue to evolve, as shown once again this year at CES, and this upcoming year will show great strides in perfecting the interaction with the consumer.  None of this would be possible, though, without the innovations taking place in consumer technology.  It may scare some how much everyone will know about them, but I for one will enjoy the day when I am not getting hit with ads, emails, calls, etc. that have nothing to do with me or what I am looking to buy.

Be an Expert not an Ehhpert!!!


As we embark on a new year, I decided to take a look back at my career within I.T., trying to figure out what the turning point was.  When did it go from a job to a career?  What advice could I give to others?  After thinking about this for a couple of days, I realized that everything changed when I started to care more about what I learned than about the paycheck, and that moment for me was about 15 years ago.

I was working at the time for what could be called an I.T. sweatshop, with 70+ hour weeks, and all I cared about was getting the next paycheck and paying my bills.  Days were long and I wasn’t learning anything new, just plugging along writing code, and it was time to start looking for a new job.  About two weeks later I had a new job offer, but it was a decrease in pay, and with a young family at home I had a lot to think about.  It was time to make the tough decision between money and the opportunity to learn more, and I chose to learn more.  At the time it seemed like a small step, but it turned out to be one of the best decisions I ever made, and I would make the same decision again about 10 years later.

Right now also happens to be the time at I.Predictus when we are setting the goals for a new year, and this year’s theme is, appropriately, Play It Up.  As a leader of an organization there are many things you’re constantly worrying about, but the most important to me are how I set a good example for my team and how I keep my team focused and energized.

So what does all of this have to do with being an Expert?  And what the heck is an Ehhpert?

An ehhpert is the person who is always answering every question with:

  • Ehhh, let me think about it.
  • Ehhh, I’m not sure.
  • Ehhh, I don’t know.


If this is you, then stop doing it immediately and make today the day that you become an expert.

Becoming an Expert and Playing It Up go hand in hand, and this is what I am challenging my entire team to do this year.  Additionally, I encourage everyone to challenge themselves or their teams to do the same.  Surrounding yourself with intelligent people raises everyone up another level and creates greater accountability among peers, along with a little healthy competition.

How do you become an expert?

  • Find a lane and don’t stop learning
  • Don’t spread yourself thin in multiple areas
  • Be confident and come across with conviction
  • Help your team and teach them what you know
  • Let everyone know you’re an expert
    • Blog
    • Present
    • Speak

If asked this question two years ago, my answer would have been a resounding “Yes.”  At that time, the focus was on displaying data using visualizations that were pleasing to the eye to help make a sale, impress the “C” levels, and put a check mark in the box of capabilities.  But times, and needs, have rapidly changed.

The tools available just two years ago were cumbersome to work with, and the landscape had only a few major players.  Fast forward to today, and the landscape is full of choices in both on-premises and cloud-based tools.  It’s not just the toolsets that have changed, however.  It is also the maturation of best practices that has brought data visualization from a buzzword to a necessity for all companies, large and small.

So what are some of the best practices? How do I make data visualization succeed for my organization?

Here are a few keys to ensuring success.

1. Use the right visualization.

It’s a lot like when the newest iPhone comes out and everyone races to be the first to get it.  Only after the purchase, however, do you realize that there wasn’t that much of a difference from the previous version and you gained no productivity.  Don’t be in a rush to use a new visualization just to say you can use it.  Make sure that it tells the user something and is organized in a way they can learn from quickly.  Nothing will kill a dashboard or report more quickly than ineffective visualizations.

2. 10 seconds to learn

The key to any dashboard or report is that the consumer must be able to take away at least one insight in the first 10 seconds.  This will lead to a higher adoption rate and make your audience come back begging for more data.

3. More isn’t better

Don’t overwhelm your audience with a dashboard or report that has too many visualizations. We have all seen demo dashboards with 10, 15 or even more visualizations. Although it looks impressive, it’s best to ask yourself questions such as: ‘Did I learn anything?’ and / or ‘Did it tell me a story?’ Most likely your answer will be no.

4. Tell a Story

Make sure that the data is always telling a story and not just being displayed for the sake of displaying it.  The latter is often found in situations where an attempt has been made to move an operational report into a visualization tool.  Operational reports have their place in the story, but they should be 2 or 3 levels deep, with supporting visualizations first that provide a drill path down to the transaction-level details.

5. Use Filters/Slicers

Don’t create multiple copies of the same report just to show a different product, category or client.  Use filters and/or slicers to slice and dice your report and give your audience one place to go to get all their data.  This also gives one benefit that individual reports don’t: aggregate values across multiple products, categories and clients.

Take a look at the two reports below and try to apply the keys to data visualization above.

Report 1.

[Report 1 screenshot]

Report 2.

[Report 2 screenshot]

Did you notice that each report has the same data?  Were you able to learn more from Report 2?  While both reports have the same data, the right visualizations make a difference in immediate comprehension.  Using column charts to show trended data over time, versus a bar chart, allows the eyes to easily move from left to right and see the patterns.  Next, tree maps are a great replacement for the age-old pie chart: not only do they show large amounts of data in a readable format, they also take your eyes in a progression from left to right as the size decreases.  Additionally, funnel charts tell a much more concise story than a bar or line chart; they draw you in to see the values decrease and quickly show the disparity between values.

What’s next?  If you haven’t already begun to adopt the concepts of data visualization within your organization, now is one of the best times to start.  There is a right tool out there for everyone.  If you find yourself wondering where to begin, reach out to a provider already offering data visualizations in your area of expertise.  You don’t always have to reinvent the wheel to take advantage of the efficiencies they offer.

SQL Saturday NYC is going full steam ahead, and we have finalized our 2 precons for Friday, May 29th.  Before I announce what they are, I first want to say thank you to everyone who submitted; it certainly made the choice very difficult.


Precon #1: Bradley Ball will be doing SQL Server Performance Tuning; check out the full event description here, and hurry, we only have 40 available spots.


Precon #2: Allan Hirt will be doing The Good, Bad and Ugly of AlwaysOn Availability Groups; you can check out the full description here.  We only have 30 seats for this class, and there is a great early bird offer of $200 off the regular price, at $299, that ends May 1st, so don’t delay.


On another note, we have gained a few new sponsors in New Horizons and Profisee, so check out their sites and see what they have to offer.

SQL Saturday NYC planning is going well, and we are now getting ready to finalize our precons, as 2 of them are on Friday, May 29th and we have had many submissions so far.  We are taking submissions until 1/15/2015, so there is just under a week left to email them to sqlsaturday380@sqlsaturday.com.  Our attendance numbers have been doing very well: we are now up to 103 people registered and 118 sessions submitted.  Session selection will surely be tough with so many good sessions submitted.  We will have a great lineup of speakers, and I encourage everyone to register ASAP.

Thank you to all sponsors so far, as we now have 5.  Anyone thinking of sponsoring, hurry up, as we only have 2 gold sponsor spots left, which guarantee you a slot for the lunch sessions.


Now on to start planning Swag and Speaker Shirts.

There are several really good blogs that already exist explaining the differences between Table Variables, Common Table Expressions (CTEs) and Temp Tables, and here are a few.

However, once you know the differences, how do you decide which one to use?  I know there is the always-dreaded “it depends,” an answer that I truly hate, so instead I decided to put together some best practices to serve as a guideline on when to use what.

CTE

In general, what I like to say is: only use a CTE if you need recursion, and yes, I do realize that there is almost no difference between a CTE and a derived table.  However, I say use one only if you need recursion, because there are some cases where referencing a CTE multiple times in joins can create a really nasty execution plan.  A minimal recursion sketch follows below.
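
To make the recursion case concrete, here is a minimal sketch of a recursive CTE walking an org chart; the dbo.Employee table and its columns are hypothetical names for illustration, not something from an earlier post.

T-SQL:

WITH EmployeeTree AS
(
    -- Anchor member: employees with no manager (the top of the tree)
    SELECT EmployeeID, ManagerID, EmployeeName, 0 AS TreeLevel
    FROM dbo.Employee
    WHERE ManagerID IS NULL

    UNION ALL

    -- Recursive member: each pass pulls in the next level of reports
    SELECT e.EmployeeID, e.ManagerID, e.EmployeeName, t.TreeLevel + 1
    FROM dbo.Employee AS e
    INNER JOIN EmployeeTree AS t
        ON e.ManagerID = t.EmployeeID
)
SELECT EmployeeID, EmployeeName, TreeLevel
FROM EmployeeTree
OPTION (MAXRECURSION 100);  -- guard against runaway recursion

This is the one job a derived table can’t do, since the CTE has to reference itself.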

Table Variable

Use a Table Variable in the following scenarios:

  1. Small tables
  2. Data is only inserted once
  3. Data has no need to be updated
  4. Known number of rows that doesn’t change

One of the reasons for the above recommendations is that a Table Variable is assumed to have a low number of rows (usually 1) when execution plans are generated, which can cause cardinality issues; a quick sketch of the pattern follows below.  If you read Ben Snaidero’s article mentioned above, you will also see the performance hits for updates to table variables.
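
As a sketch of those scenarios, here is a small, insert-once lookup implemented as a table variable; the @StatusLookup variable and the dbo.Orders join target are hypothetical names for illustration.

T-SQL:

DECLARE @StatusLookup TABLE
(
    StatusCode tinyint     NOT NULL PRIMARY KEY,
    StatusName varchar(20) NOT NULL
);

-- Small, inserted once, never updated, known row count
INSERT INTO @StatusLookup (StatusCode, StatusName)
VALUES (1, 'Open'), (2, 'In Progress'), (3, 'Closed');

-- Note: the optimizer estimates very few rows for @StatusLookup,
-- which is fine here but hurts badly if the variable is actually large
SELECT o.OrderID, s.StatusName
FROM dbo.Orders AS o
INNER JOIN @StatusLookup AS s
    ON o.StatusCode = s.StatusCode;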

Temporary Table

Use a Temporary Table in the following scenarios:

  1. Large tables
  2. Need for additional DDL, such as an index
  3. Need to access data in child stored procedures
  4. Need statistics
  5. Data needs to be inserted multiple times or updated
  6. Unknown Data size

Temporary tables, as you can see, have a much broader use and are more of the catch-all for everything that doesn’t fall into the CTE or Table Variable bucket.  One thing to remember is that if you have a data set that needs to be generated once and reused across a process spanning multiple stored procedures, a Temp Table created in a parent procedure can be used in child stored procedures in the execution tree.  This is a great way to generate temporary data only once and access it in multiple stored procedures, as sketched below.
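
Here is a minimal sketch of that parent/child pattern; the procedure, table and column names are made up for the example.

T-SQL:

CREATE PROCEDURE dbo.ChildProc
AS
BEGIN
    -- #WorkingSet is created by the caller but is visible here,
    -- because child procedures share the parent's temp table scope
    SELECT OrderID, OrderTotal
    FROM #WorkingSet
    WHERE OrderTotal > 100;
END;
GO

CREATE PROCEDURE dbo.ParentProc
AS
BEGIN
    -- Generate the data set once...
    CREATE TABLE #WorkingSet
    (
        OrderID    int   NOT NULL PRIMARY KEY,
        OrderTotal money NOT NULL
    );

    INSERT INTO #WorkingSet (OrderID, OrderTotal)
    SELECT OrderID, OrderTotal
    FROM dbo.Orders;

    -- ...then reuse it in any child procedure in the execution tree
    EXEC dbo.ChildProc;
END;
GO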

Conclusion

Remember that these are just best practices, or in other words a starting point, one that can be applied generally regardless of the skill set of the SQL developer.  As with anything, there are always times when the data will call for an exception to the rule, and when it does, that is when you go outside the best practices.