T-SQL Tuesday #148 – Advice on running a user group

It’s T-SQL Tuesday!!

T-SQL Tuesday is the brainchild of Adam Machanic (Blog | Twitter). Adam sent out the first T-SQL Tuesday invitation in December 2009. It is a monthly blog party held on the second Tuesday of each month. Currently, Steve Jones (Blog | Twitter) organises the event and maintains a website with all previous posts. Everyone is welcome to participate.

The Ask

This month’s T-SQL Tuesday is hosted by Rie Merritt (Twitter | LinkedIn). Rie says: “For this edition of T-SQL Tuesday, I’d like to ask everyone to write about all the various aspects of running a user group.”

The original post is here.

My contribution

Building out a session schedule for your user group

I’ve been planning and running our dataMinds user group evenings for a few years now, and have learned a few “gotchas” and “aha’s” while doing so. With what I know now, I would have done things quite differently when starting out. For that exact reason, this is a T-SQL Tuesday I jumped at participating in. If someone else can benefit from my random thoughts, then I’ll be very pleased!

Listen to your Target Audience, and include them

For me, this is the most crucial thing about building your session schedule. Most user groups have a pretty specific scope, focusing on Power BI, DBA, Data Engineering, .. topics. Early on, we made the decision to cover the entire span of the Microsoft Data & AI landscape. It gives us the chance to build in lots of variety in speakers, topics, and complexity.

A definite downside is that when we switch from the hardcore SQL Server Internals stuff to Power BI visualisation, we see a big change in the attending audience. It’s harder to build up a strong bond, as most attendees will not want to attend the majority of the sessions. And still, we like the way we’re doing things and get good feedback from those who attend. I know, it’s all a bit different these days because of the virus thingy that shan’t be named, but the patterns still come through quite clearly.

We actively ask our user group attendees, and recipients of the newsletter, to come to us if there are specific topics or speakers they want to see covered in a session. Even better, something they want to cover themselves. By listening to the things your users are facing in their occupations, you have a bit more certainty that you’re planning sessions that can actually help them.

Pick your dates

Two years ago, we switched to planning the event dates at the beginning of the season (September to June). For me, this has worked so much better, as I now just plan an entire quarter with the dates I have available. I know other groups take a certain day of a certain week every month. It all depends on what works for you and your attendees. As we’re now doing 1 – 2 evening sessions per month, we try to alternate between days of the week, to make sure we can include other people too.

The art is in the balance

Depending on that target audience for your user group, it’s all about finding a healthy balance between speakers, topics, and levels. Are you including sessions that vary between introductory stuff and more complex deep dives? Is the same speaker presenting at your user group every quarter, with a decline in attendance? Are you covering only Power BI stuff when you also have a large DBA audience?

There’s no silver bullet here, as it will be catered to your specific situation. But I think striving for a healthy balance based on your member feedback is a sensible way to progress.


In the olden days, our user group held its evening sessions on site at one of our partner companies. With usually 2 sessions planned per evening, one of their employees would present a community-oriented session, alongside another speaker that fits well with it. The partner provides the location and catering, meaning we can run the user group as a low-cost effort. And because we’re essentially travelling through the country (it’s Belgium, so don’t think too much of it), we cater to people living in the different parts.

Now with the virtual sessions, it’s mostly 1 extended session (60 – 75 minutes) where the speaker can take their time to explain things, with plenty of room for a healthy portion of banter.

With things returning to a steady situation in Belgium, we’re actually planning a return to in-person activities as we speak. For now, we plan on taking the same approach as we did before, and adapting if we see the patterns change. One thing I’m personally worried about is the increased percentage of no-shows. When we’re going back to ordering drinks and food, I actually do care if someone shows up.

Network, network, and network. Did I mention network?

Now for the tricky part .. How do you actually find those speakers for a healthy mix? What worked for me was networking, in a variety of forms. Starting public speaking myself really helped me increase my range, but it doesn’t have to be this way. What matters most is that you try to build connections with those people you’re hoping to plan on your schedule, or that you meet people that can introduce you in a proper way. It’s not always easy, but so rewarding when you finally land those people you think will deliver a killer session for your group. Social media has changed this process drastically, where it’s become so much easier to get an overview of what is happening, or which interesting sessions pop up somewhere else.

Then, it’s keeping those eyes and ears open, at all times. Do you see a local person posting an incredibly useful article on LinkedIn? Why not ask them to turn it into a (short) session? Do you have a person that’s consistently attending most of your sessions? Why not check if they need a nudge to get on stage themselves? Maybe you’ve heard of a consultant that has done some crazy stuff at a client to get something complex to work? All simple examples that can really help you build out the schedule.

And then, try to look at other user groups or conferences. Which sessions are being planned there, and could be a great fit for your audience? You’d be surprised by the interesting sessions you can come up with this way.

Be an opportunist!

Then, when you’re browsing those social media you may run into a message by a speaker saying they’re travelling to your region for whatever reason, or you’ve heard it through the grapevine. It never hurts to send them a message to check if they’d be interested in doing a quick session for your user group. It won’t always work, but it can lead to an interesting addition to your schedule!

Pick the tools you prefer

I know Sessionize has now set up a User Group mode, which supports multiple event dates that speakers can apply to. Personally, I still prefer reaching out to speakers myself, as this gives you better insight into who you’re asking. Maybe a speaker you want to plan wouldn’t submit on their own initiative. But it works the other way around as well, as you can get interesting submissions you wouldn’t have thought of otherwise.

Whenever I reach out to a speaker, I do it with the same standard template I’ve built over the years. It contains some information about how we run a session, which session we’d like to plan, and the dates that are still open. With the remote approach of the last 2 years, we also ask up front if we can record, upload, and share demos afterwards, with a clear indication that this is not mandatory.

Then, when we’ve agreed on a session and date, I set up the events on our own website and Meetup, and send out a calendar invite to block calendars for all parties. One last tip: don’t be too optimistic when reaching out to speakers. For example, don’t reach out to multiple speakers for a single date if you can only plan one. It’s poor planning if you have to go back on your word because of a double booking.


Another quick ‘n dirty writeup for T-SQL Tuesday, but I really felt I had to include my thoughts on this one before heading off to London. As always, I’d love to hear from you if you have any questions, remarks, or rants!

Stay safe, take care!

Leave a comment

March Madness! The Return of the Conferences (SQLBits and Global Power BI Summit)

I’m stoked! With just a few weeks left to go, I’m blessed to get to go outside again, and visit London. From March 8th – 12th, I will attend SQLBits, and even present two sessions myself. Happening in that same week is the Global Power BI Summit, as an online event. Here, I will be presenting two sessions, and hosting a table talk with wonderful people. Personally, I’m really excited for this, as it means I get to engage with our community, have chats with customers, and reminisce about things with friends I have not seen in a loooooong time. Both of these events are aimed at a broad spectrum of attendees, and offer an incredible range of speakers and topics. Driven by a community background, this is about sharing the passion for our data platform ecosystem, and helping each other achieve more.

SQLBits (March 8th – March 12th 2022, @ ExCel London & Online )

Learn more about SQLBits


At SQLBits, you can find me wandering the halls (I’m not that hard to spot 😉), having a conversation at the Microsoft Booth, or attending one of the many great sessions. All of these with a 90% probability of a coffee in hand. I’m particularly pleased a healthy number of non-tech sessions made it onto the schedule. I always like attending sessions of this nature, as they teach me things that are harder to figure out on your own. For instance, I can quickly Bing something on a certain DAX construct, but listening to someone share their personal story on mental health would be much harder to replace by crawling through your favorite search engine.

Registration for SQLBits is still open, and you can still grab your spot! Not feeling 100% comfortable making the trip? SQLBits is being hosted as a hybrid event, and you can attend from the comfort of your own home, at a seriously reduced price. Find out all the details on their registration page.

That said, there is tons of techy stuff on the board I am looking forward to as well, even beyond my Power BI comfort zone. And I’ll be honest, the return of the “hallway” track is what is making me all giddy for early March. There is something about those hallway conversations, with their absolute honesty and cutting-edge discussions, that does it for me.

If you want to come attend one of my sessions, these are your options:

  • Thursday March 10th at 4:40 PM UTC, I will share some of the things I have learned on asking questions, and helping those that I want to help me. Everyone can ask simple questions, but there are a few aha’s and gotchas if you want to take it further.
  • Saturday March 12th at 9:30 UTC, I get to spread the joy about Power BI dataflows ❤️. This is an introductory session about Power BI dataflows, answering a few questions about why I think they could be a good fit for you.

I may make a cameo here and there in other sessions, but that is just something you will have to find out on the spot…

Global Power BI Summit (March 7th to 11th 2022 @ Online)

Learn more about Power BI Summit


This year we will see the 2nd edition of the Global Power BI Summit, which goes all in on Power BI. It is held as an online-only conference, and repeats sessions multiple times to accommodate different time zones across the globe. You can find an amazing range of topics and presenters on the line-up, and a really good online experience for a conference. I’ll probably try to attend a few sessions here and there, and also present a few things myself. You can find all the details on their website and registration page.

At Power BI Summit, these are the sessions I will be involved in:

  • On Tuesday March 8th at 10:30 AM UTC +1 (and a repeat at 10:30 PM UTC+1) it will be all about Power BI Premium Gen 2. This has shifted into General Availability, and the deadline for migration is inching closer. I’ll handle some of the common questions I have seen, and share some practical insights you can take back with you.
  • On Wednesday March 9th at 10:30 AM UTC+1 (and a repeat at 10:30 PM UTC+1) I will share my tips and tricks to keep up with Administering and Governing your Power BI Tenant. Expect some practical tips of things I have picked up in the past, and will make your life as an (accidental) admin easier.
  • Then, on Friday March 11th at 10:30 AM UTC+1 I’ll join the panel of a Table Talk with Thomas Martens, Štěpán Rešl, and Nicky van Vroenhoven, where we await all your questions and input on Power BI Administration, Governance, and Data Culture. There is also a repeat at 10:30 PM UTC+1, but I will not take part in that one due to an activity conflict at SQLBits 😊.

See you there?!

Leave a comment

Writing Session Abstracts (Data Minutes)

Data Minutes #2 took place on January 21st 2022

On Friday January 21st 2022, I had the absolute joy and pleasure of presenting a lightning talk at Data Minutes, run by William Durkin and Ben Weissman. It features 10-minute timeslots assigned to a large number of speakers, and I used mine to share my thoughts on writing a session abstract for conferences, user groups, or other types of events. I mostly kept our Data Platform / Power BI (community) conferences in mind, as these are the type of engagements I am most experienced in. Drawing on my prior activities as an attendee, a speaker, a program committee member, and a conference organizer, I thought back on things I liked when looking at session abstracts. If you are interested in watching the recording, you can find it on the Data Minutes YouTube channel (link), and find the slides over on my GitHub page (link).


In the end, this is a single person writing down thoughts on what works for them. As a result, there is bias and subjectivity involved, and my advice is to take these as nuggets to mold into your own set of guidelines. Every conference, user group, .. has its own set of subtleties, and will have different things it requires and prefers. Meaning, this is not your “easy-mode, get accepted anywhere” solution. You are still the one responsible for providing quality work, and doing the research.

Just 1 more thing before we start.

Before you start writing your abstract, there are a few things you may want to consider. For me, the most important thing to ask yourself is this:

Why do you want to present your session? What are your goals?

Are you simply happy to share your experience? Do you want to have standing room only in your sessions? Are you looking to promote yourself, your product, your organization? There are no wrong answers to this question, as whatever works best for you is what drives your ambition.

The one thing that took me some time to realize is that you are not trying to draw in as many people as possible to your session; you’re trying to keep out those people that don’t fit well with your target audience. It’s an odd statement, I know. But in the end, you want the people that attend your session to be satisfied with what they have seen, attend other sessions presented by you, or maybe even do business with you. People that misunderstood your intent and message have a higher risk of being discouraged, and might not want to attend another session by you again. If you are in it for the long haul, you’ll definitely want to see people attend multiple times, as the attendee pool is not as large as you might think.

Then, I want you to think about where you are applying to speak, and research the subtleties of that event. Most organizers put a lot of effort into describing their target audience, and the types of sessions that have worked for them in the past. In essence, they are handing you the building blocks to engage with their audience on a silver platter. Is this a more formal conference? Are they looking for 30-minute sessions only? Is it specifically aimed at launching new speakers? Read up on the details organizers provide, and do some research about prior editions (if applicable). Odds are you will find some really useful information you can turn to your advantage to increase your chances of being selected.

Why bother?

Writing these abstracts isn’t just a trivial task you get out of the way because you have to. Most of us don’t have the reputation or relations to dictate where we want to speak; we have to prove our proposed topic will be useful to include. Once written, the abstract benefits multiple groups of people in their decision-making process. The abstract is a tool in your belt to sell yourself to them, and get a chance to share your thoughts on a topic. For me, those stakeholders are:

  1. An organizer, and/or member of the program committee
  2. A potential member of your audience
  3. Yourself

You are pitching yourself to organizers and committee members, as they usually decide who gets planned on the conference schedule, and which sessions are compatible with their goals. You are pitching yourself to members of the audience, as they will assess whether the session is right for them, and whether they will learn something new or have a good time. Unfortunately, some audience members don’t read anything, and end up voicing their discontent (/rant). At some conferences, it is even the audience members who vote on which sessions end up being planned.

Most importantly, you are pitching the session abstract to yourself, as this is your first formal moment to think about the scope of the session, content you want to cover, which personas you would like to have attend your session, .. This is where the intent gets a form of reality, and you have to deliver something before next steps can be taken.

Things you’ll want to define

Okay, we’ll start going into the actual abstract, just after you answer these 4 simple (not really) questions.

Target Audience? Are you planning on going very technical, discussing business scenarios, ..? Do you want to give useful tips to new starters, or provoke the seasoned veterans to think hard about a specific subject? Usually, you can distil a target audience if you ask yourself questions about the message you want to send. The target audience for your session has to match the target audience at your speaking engagement to obtain the best results, especially if you want your attendees to be interested in attending your sessions again.

Session Level / Complexity? In the Data Platform realm, sessions are typically measured using a numeric value in the range of 100 to 500, though these ranges vary between events. They are designed to represent session complexity on an increasing scale. A level 100 session will typically be an introductory session, where a level 500 (or higher) session will be at the expert level. For the sake of providing a practical example, I’ll use the session levels we use for dataMinds Connect: 100 (Introductory and Overview), 200 (Intermediate), 300 (Advanced), 400 (Expert), 500 (Guru), and 9000+ (Over Nine Thousand). I’ve witnessed a lot of discussion about levels in the past, and am well aware that this is not an exact science. Being as transparent as possible about learning objectives will help you set the session level.

While this number seems trivial, it is a very important tool for providing information about your session. Not every audience member reads an entire abstract, but they will base their decision on the session title and session level. It is in your best interest to consider the level of your session. For instance, organizers and committees may be specifically looking for an introductory session on a certain topic, or a session that dives very deep into some internals of the product. The majority of sessions get submitted in the 100 – 200 range, which means you want your session abstract and topic to stand out.

Prerequisites? Do you expect a session attendee to have prior knowledge about SQL? Do they need to be able to understand how joins work? Or how query plans can be read? Is it an absolute must to understand Filter Context in DAX? If you are planning on building on a certain topic, and are making assumptions about the knowledge of your attendees, it makes sense to make that known. Again, it is about creating that bond between yourself and the attendees, and having them attend your sessions again in the future.

Learning objectives? When everything is said and done, what are the key things an attendee should have learned? If you consider something to be a key topic in your session, you will most likely want your attendees to pick it up as a learning point.

Common Structure

In the majority of conferences I’ve engaged with, a session abstract consists of 4 key segments. At some larger events, more segments are added, especially when program leaflets are being handed out to attendees. Our community looks to be standardizing on Sessionize, which provides these as standard fields. Limiting myself to what I’ve encountered in most cases, these are the key segments:

  • Title: 1 sentence, used in schedule, website, leaflets, ..
  • Abstract (Body): Synopsis of your session, typically 3 – 5 paragraphs
  • Notes: Private to you and organizers/program committee
  • Bio: Personal presentation
  • Blurb (Short version): Limited number of characters allowed (ranging from 140 – 200), to explain the session outline, used for a more detailed schedule. This is more of an exception in what I have encountered, but it is common enough to include.

Title (Short, Sweet, Fantastic).
Personally, I prefer these to be short and catchy, briefly describing the problem that will (attempt to) be solved, or the scenario at hand. At maximum, I’ll make them 10 words long. This is my bias as an organizer showing, as long titles impact formatting on pretty much everything we design for a conference: schedule, website, intro slides, posters, .. But also, as an attendee you might be turned off by a very long sentence, for something that could probably be explained in a few words.

Abstract (The Meat ‘n Potatoes).
As a rule of thumb, you will want to avoid stating the exact same thing as in your title, or putting in a ‘to do’. As I explained before, the abstract is your pitch, and you want it to be thorough and useful for those reading it. For writing the actual abstract, we are going to reuse the answers to the questions from before, as these are things we definitely want to use.

To start, it makes sense to describe the problem we’re trying to solve, the business scenario we’re facing, or the situation at hand. Then, we want to include the audience, prerequisites, and learning objectives we defined before. This should result in about 3 – 5 paragraphs, which I think is a good balance between having enough text to explain the specifics, and being so long that no one reads it. But again, this is a personal preference.

When writing the abstract, I pay close attention to grammar and spelling, especially for product names and industry-related terms (I will refrain from starting a riot about AlwaysOn at this point). Yes, people make mistakes and we are all allowed to do so. But if a quick quality check can fix these problems, you avoid a lot of people forming a prejudice that may not be true.

Depending on the conference you are submitting to (do the research!), you can find a balance between formal writing and sneaking in some quirky remarks or references. The writing style can swing both ways, which is often forgotten. Write too formally, and a community-driven, lighthearted conference might not want it. Write very informally, with lots of lame jokes, and you may not make the cut at an academic research conference. Again, do the research and find out what works!

Then, I try to avoid using sentences like ‘we will look at a number of techniques’, or ‘we will describe a few scenarios’. They are very vague, and were probably written at a time you had not completely decided on the content you wanted to include in your session. Instead, it can pay off to briefly describe the items you want to cover, so an attendee can decide whether they are new and exciting to them, or not as relevant as you think.

Personally, I’m not a big fan of including parts of a biography in the actual session synopsis. For instance: ‘Join John Doe, author of Book XYZ and presenter of show ABC with over 25 years of experience in 17 different technology companies throughout the globe with distinguished accomplishments 1234 ..’ is something I think is better suited for the presentation of the speaker, not the session description.

Notes (Insert Bribes Here**).

As an organizer, I rarely see the Notes section being used, but it actually is a really useful space. Anything you put in here is private between the organizer, program committee, and yourself. This is an excellent place to include feedback or references from prior deliveries of this session, or to indicate your willingness and flexibility to make the session fit better into their schedule by changing the complexity or adding an extra solution method. In the end, an organizing committee always has a certain idea of which types of sessions and topics they want to include, and it is up to you to prove yours can be a good fit.

** For the sake of completeness, this is not to be taken as a serious remark.

Bio (About yourself).

Now let us present a few things about ourselves that can be relevant to the story we want to tell. Depending on where we are submitting, the writing style can vary. In the end, I personally try to keep this somewhat professional regardless of where I am submitting. It can definitely contain some quirks and references, but I don’t think anyone is interested in the fact that you ate 37 hotdogs at the company picnic in 2017.

Which leads to the point that you want to include relevant and up-to-date information about yourself. If you changed employers 4 years ago, it is definitely time to reflect that in the bio as well. Also, consider the photo you are using for your abstract. Do you really want to use the “after photo” from said picnic in 2017, or the late hours of the Christmas party in 2018? The photo you choose here definitely has an impact, so consider your options.

Then, make sure you are including links to your online portfolio, or places where people can learn more about you. Think your blog, LinkedIn page, Twitter profile, GitHub repo, .. If you already have supporting videos, blog posts, or prior instalments about your topic, this will definitely help your case.

Before you submit

Good, we’ve written everything we needed to. Let’s hit send as quickly as we can, right? Right? As my final piece of advice, I suggest you put some time into the review process. First, make sure you review the content yourself, to assess whether it effectively portrays the message you want to convey, and that there are no large grammatical errors.

Then, I have always had great experiences with getting external opinions on what I wanted to submit. Ask other people to review the abstract for you, and answer a few basic questions. If the responses come back with something completely different from what you expected, you may want to review the abstract again. The questions I am referring to are:

  • Who should attend the session?
  • What are we trying to solve / describe?
  • Why are we doing this?
  • What are we learning?

The people you reach out to can be virtually anyone, and they don’t even need to have prior experience with the topic. Heck, they don’t need to have any knowledge about the subject domain at all. If someone that is completely new to the topic can answer the questions, you can say for sure the message is coming across the right way. But also, other speakers and organizers can have valuable input for you. Cathrine Wilhelmsen phrased it so well by stating we are “aggressively friendly” in our community, and we will always try to help, or find someone who can.

Wrap Up

To wrap up after another lengthy post, I want to thank you for making it to this point. I’ve shared my thoughts here, and I hope you can take away a few things to mold into what works for you. To conclude: do the research, and be thorough!
Let me know if you have made some changes to your session abstracts, and if it helped you!

Take care!


Buh bye, 2021!


In my round-up for 2020, I mentioned it had been a weird year for just about everyone. I’ll go out on a limb and state that 2021 has been just as weird. And yet, I don’t have major reasons to complain, and am incredibly grateful for it. 2021 has been a rollercoaster for me, with plenty of things to reflect on. Overall, I’m pretty pleased with how everything has gone, and I’m hoping I can continue this trend into 2022.

I didn’t set any fixed goals for myself for 2021, and am glad I didn’t. I’m sure I wouldn’t have achieved the ones I would have set, despite having done so many different things. To look back at what I’ve been up to this year, I’ll break it down into these categories:

  • Professional
  • Community – Personal
  • Community – dataMinds
  • Personal


Professional

In short, I made a career move! Around the May-June 2021 timeframe, a lot of things were shifting at my former employer, and it made me realize my heart wasn’t 100% in consulting anymore, and that I had to start looking at my options. I didn’t feel like switching to a different consulting company in Belgium, or starting my own thing as a freelancer.

When I noticed a job posting for Power BI CAT in Europe had opened, I knew I had to move quickly to get my stuff together. I managed to get everything sorted out, and got my application in before it was closed off. Some conversations and interviews later, I got word a proposal was coming my way, which I decided to accept. I took my fair share of time to think things through, but quickly realized I’d only blame myself in the future, if I didn’t give this my absolute best effort.

I joined the mothership on December 1st as a Program Manager in the Power BI CAT (Customer Advisory Team), after 10 years in consulting. I’m now a part of the Europe/Rest of World team with Rui Romano and Lars Andersen, led by Chris Webb. We’ve got exciting new people lined up to join us in 2022, so this will definitely be incredibly interesting! 1 month in, I can say it’s a real change from what I was doing before, and I still have so much ground to cover.

Microsoft is a large organization, and is not easy to navigate. Luckily, every single person I’ve spoken with so far has been exceptionally welcoming and helpful, and really helped me to get settled in. Now that I’ve worked my way through my onboarding materials, and am starting to get caught up on all of the super duper secrets and codenames, I’m looking forward to helping organizations achieve more with Power BI.

Community – Personal


Joining Microsoft also meant I retired from the MVP Program, and am no longer a Data Platform MVP. A pity, as I really liked the options it gave me to connect with other people in the MVP community, and it absolutely tanks my chances of getting to attend an in-person MVP Summit at the Redmond campus. I’ve had heaps of fun engaging in these activities, and will definitely miss it.

When I was preparing the figures for my round-up, I thought 2021 had been a slow year for me in terms of presenting community sessions. Looking at the numbers, I realized I was horribly wrong, as it has actually been quite the busy year, with 28 sessions presented over the course of the year. The majority of these were virtual, the only exceptions being Power BI Next Step (September) and South Coast Summit (October).

In April 2021, I did decide to tone it down with presenting virtual sessions, as it was simply not giving me the satisfaction I had before. I still want to do them, but at a lower frequency. Virtual event fatigue is real, yo! That said, I do appreciate all the organizers for the time they put into putting on these user groups, conferences, .. Because of their effort, I get to present these talks across the world.

I have some ideas for new sessions brewing, so I’m trying hard to take the time to work out these vague ideas into actual sessions. My speaking schedule for 2022 is already filling up nicely, and I definitely want to get some more variation in the sessions I’m presenting.

An overview of the sessions I presented in 2021.


I managed to write 6 blog posts (including this one) in the past year, and this is definitely where I want to put more focus in the next year. I’d like to have supporting blog posts and Jupyter Notebooks for the sessions I’m presenting, where I can provide more context and explanation for certain topics. And, instead of only writing down my random discoveries in OneNote, I could definitely turn those into blog posts, to have some more reading material in the future. Who knows, I may even publish a blog post under 1000 words this time 😂.

Our local region

Probably the most important one of all: I want to make sure our local community talent gets the chances they deserve. We’ve got some incredibly talented speakers in Belgium, whom I think will do great in the future. I’m not sure yet about the what, when, and how, but I do want to make sure they get some extra exposure, and help if they need it.

Community – dataMinds

For our user group, 2021 was also a curious year. Our ‘normal’ planning is to hold about 10 evening sessions in the September – June timeframe, send out a monthly newsletter + round-up in that same timeframe, plus organize a free Saturday event in March and dataMinds Connect in October. Then, we sometimes decide to opt in for Global Bootcamps, but it’s not necessarily part of the plan at the start. In 2021, we did put together a nice mix of speakers and topics, and can look back with joy on the attendance we got in these weird times.

An overview of the dataMinds evening sessions of 2021.

For the Saturday event in March, we unanimously agreed that if we were putting this together, it had to be distinct from all the other virtual events that have popped up lately. We did notice that the representation of speakers from our local area (Belgium, Netherlands, and Luxemburg) in these online events was fairly low, with mostly the same names coming back. For that exact reason, we branded our first dataMinds Saturday as the BeNeLux edition, and only wanted to schedule speakers residing in this local region. We managed to get some new speakers launched, and this was so rewarding to watch.

Then, our crew agreed we were not looking forward to another virtual edition of dataMinds Connect, and decided to move along with planning for a ‘as normal as possible in-person’ edition in October. Come May 2021 we made the decision to move forward, and get the wheels in motion, with constant decision gates based on external factors.

It wasn’t easy getting this edition planned and executed, but I’m so happy we managed to get it done in a responsible fashion. Personally, I got so much energy out of having conversations with loads of people at the venue, and am glad we pulled it off.

We’re cautiously considering our next steps for 2022, and specifically the 15th anniversary of dataMinds Connect (formerly known as SQL Server Days Belgium), and I’m looking forward to keep contributing to these activities. And, we already have a great line-up of evening sessions planned for Q1 2022, with plenty of ideas for more sessions to come!


2021 has been a wild year for me where I kept getting reminded with force that I’m not 18 anymore, and need to make myself take some breaks. I love doing a variety of activities that can be related to the tech stuff I do, or beyond. But enough is enough, and I really need to pace myself. Pace myself, and take the time to better process some things, which will help me out in the long run.

For 2022, I’ve got some cool things outside of tech lined up, that I’m cautiously preparing for. Here’s to hoping I get to see these plans through! I’m rooting for our world to go back to a less erratic situation, and to get back out there.

Wrapping up

As I mentioned in the intro, I’m so fortunate and don’t have any major reasons to complain. My heartfelt wishes to you and those close to you.

May 2022 bring you everything you deserve.


T-SQL Tuesday #145 – The Pandemic, Costa Rica, and Events

It’s T-SQL Tuesday!!

T-SQL Tuesday #134

T-SQL Tuesday #145

T-SQL Tuesday is the brainchild of Adam Machanic (Blog | Twitter). December 2009 was the first T-SQL Tuesday invitation that went out by Adam. It is a monthly blog party on the second Tuesday of each month. Currently, Steve Jones (Blog | Twitter) organises the event and maintains a website with all previous posts. Everyone is welcome to participate in this monthly blog post.

The Ask

This month’s T-SQL Tuesday is hosted by Xavier Morera ( Blog | Twitter | LinkedIn ). Xavier says: “How much do you love meeting in person, where would you like for your next event to take place, and why Costa Rica?”

The original post is here.

My contribution

1. Which is your favorite conference and why?

Over the years, I’ve attended a variety of events, ranging from really small ones (40 people) to fairly large ones (12,000 people). I’ve visited larger ones like PASS Summit and SQLBits, and thoroughly enjoyed them, but the whole experience felt a bit draining overall. There are constantly new people popping up for conversations, too many things happening at the same time, and way too many things I’d like to be doing at that same time. And, I’m notoriously bad at hiding in a crowd, as people always seem to find me quickly.

For me, I prefer the smaller events where you can have some good conversations, enjoy some quiet if you want to, and where there’s plenty of room for some friendly banter with other attendees, speakers, volunteers and sponsors. When I thought about this, two examples first sprang to mind: DataGrillen and Power BI Next Step. Being completely transparent, I attend events for the ‘hallway track’, to have chats with people I otherwise wouldn’t run into. Smaller events just make it easier to achieve this goal, with or without certain types of beverages 😊

You’ve probably noticed all the events I refer to are in-person events. I do not dislike online events, but they just don’t give me the same level of satisfaction as a speaker, organiser, and attendee. I understand the place they have in our current zeitgeist, but there are options I prefer.

2. Which is the best venue that you have visited for a tech conference?

I’ll be very chauvinistic here. The venue we had for dataMinds Connect in 2017 and 2018 was pretty cool and got genuinely good feedback, despite the session rooms having some sound & tech issues. Back then, we visited the Ghelamco Arena in Ghent, which houses KAA Gent.

The view we had from the large rooms overlooking the soccer pitch was pretty cool, but I particularly liked the room we had in the actual Press Box of KAA Gent. This is where Klaas Vandenberghe ( twitter | linkedin ), Chrissy LeMaire ( twitter | linkedin ) and Rob Sewell ( twitter | linkedin ) held their dbatools precon session, but decided to turn it into a “press release”, for obvious reasons. After 2 years, we ended up moving away from this venue for multiple reasons, and went to the Lamot venue in Mechelen. But still, if you’re asking for the coolest venue, this one really springs to mind for me.

The Ghelamco Arena Pitch being cared for, during the conference.

Klaas Vandenberghe, Chrissy LeMaire and Rob Sewell hosting the dbatools Press Conference, ehrrr precon session.

3. Who is the best presenter that you have ever listened to?

I’ll split this into two parts, being online and in-person sessions, because they are completely different experiences to me. These days, you’ll rarely see me attend online sessions where I’m not moderating or presenting myself. I have a notoriously bad attention span (Squirrel!), and I focus all my attention budget on the calls I do during my daytime occupation, whereas I find it very difficult to stay attentive after hours.

There’s a handful of online presenters I’ve witnessed so far that actually keep me zoned in to their entire session. To single out one, I’ll have to go with Alexander Arvidsson ( twitter | linkedin ). Alexander has a certain way of telling a story, and using his set of tools, that keeps me drawn in. For this, I can only tip my hat. If you’re not completely sure about what I mean, I suggest you take a look at his session called “The Untruthful Art – Five Ways of Misrepresenting Data” ( youtube ), which resonated particularly well with me.

For in-person sessions, I can only put forward one name, and be a bit chauvinistic again. I’ve lost count of how many times I’ve seen this person present, and am completely amazed at the breadth and depth of topics he’s covered throughout the years. In the 10+ years I’ve attended sessions by Nico Jacobs ( twitter | linkedin ), I’ve never walked away without learning something new and interesting, even for a session on a topic I considered myself proficient in. Over the years, we’ve jokingly called Nico our ‘joker’, as we can always call upon him to fill a gap for a specific topic we’re looking for. He’ll probably have some materials and demos good to go anyway..

4. Which location would you like for your next event to take place and why Costa Rica?

There’s plenty of places in the world I’ve yet to discover, and if events are reasons for me to go there, I’m all on board! Someday, I’d like to present at conferences in the US & Canada, as I’ve been told there’s pretty interesting differences in the complete experience. Costa Rica sounds like a very nice place to go to, but only if I can extend the stay by 2 weeks to visit the country, and go train with an old friend of mine at his local dojo.

For reasons outside of tech, I’m dreaming about doing a tour of Japan and visiting cities like Osaka, Kyoto, Nagasaki, Tokyo, .. Visit those cities, but also train at the different Shinkyokushin Dojos spread throughout the country, and hopefully attend the World Cup of Shinkyokushin Karate one day. If an event in Japan would help me achieve those goals, I’ll grab the opportunity with both hands 😊.


All the reminiscing about times past is fun. But I’m about ready to start doing the real thing again. Here’s to hoping we can pursue those dreams in the near future, and keep it safe for everyone ..
It’s a quick write-up, as I nearly overlooked the invite. But still, I wanted to contribute to this one!

Stay safe, take care!


PowerBIQuiz: APIs & PowerShell

Wednesday past (March 24th 2021), I had the wonderful pleasure of appearing as a co-host on the bi-weekly Power BI Quiz by Just Blindbaek, a fellow Data Platform MVP from Denmark. We’re nearing the end of Season 3, and though I haven’t been able to chime in every time, or score the way I wanted to, I still have a blast every time I play. There’s a large number of returning faces every time, and we keep each other on our toes and have tons of fun whilst doing so.

When Just asked me a few weeks back if I wanted to host a topic, I immediately jumped at the opportunity. I’ve been doing some work around APIs and PowerShell, a topic I dreaded before. Hence, this was an excellent topic to test the knowledge of the Power BI Quiz participants. All questions are aimed at a governance perspective, and how the APIs/PowerShell can be a useful tool for helping things stay afloat.

Whilst making these questions, I also realised it would be excellent to explain the questions and answers in a blog post afterwards, to give a better explanation of my reasoning. I also learned it’s harder than expected to create questions that fall in the not-too-hard, not-too-easy sweet spot for a quiz. This makes me respect the other quiz makers even more!

The recording of the Power BI Quiz can be found below, or you can navigate through the Power BI Quiz website.


The Questions and Answers

1. The Power BI Activity Log can return data for the last ..

  • A – 90 days
  • B – 30 days
  • C – 45 days
  • D – 60 days

Answer: B
Explanation: The Power BI Activity Log will only return data for the last 30 days, which is different from the Office 365 Audit Logs through the Security & Compliance Center, which return the last 90 days. You’ll have to “sacrifice” those extra 60 days of possible history, but in return you get a more stable API (in my opinion), and fewer required privileges to extract that data.

Meaning, you only need permissions on the Power BI side (specifically the Power BI Admin role), and no role in the Office 365 Security & Compliance Center (which would otherwise require at least the Viewer role). I have had some organisations where this was a definite no-go, as that role would let them extract the logs for all Office 365 components.

Microsoft Docs page outlining the differences
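To illustrate working within that 30-day window, here’s a minimal Python sketch. The `admin/activityevents` endpoint is the documented one; the helper names are my own, and the actual calls still need a bearer token from a Power BI admin account. The API only accepts a date range that falls within a single UTC day, so you extract one day at a time:

```python
from datetime import datetime, timedelta, timezone

BASE = "https://api.powerbi.com/v1.0/myorg/admin/activityevents"

def daily_windows(days=30, now=None):
    """Split the retention window into per-day (start, end) pairs;
    the API only accepts a range within a single UTC day."""
    now = now or datetime.now(timezone.utc)
    today = now.date()
    windows = []
    for offset in range(days, 0, -1):
        d = today - timedelta(days=offset)
        windows.append((f"{d.isoformat()}T00:00:00.000Z",
                        f"{d.isoformat()}T23:59:59.999Z"))
    return windows

def activity_url(start, end):
    # The API expects the datetime literals wrapped in single quotes.
    return f"{BASE}?startDateTime='{start}'&endDateTime='{end}'"

urls = [activity_url(s, e) for s, e in daily_windows()]
print(len(urls))  # 30 daily extraction calls
```

Run something like this on a schedule and persist the results, and you keep history beyond the 30-day cutoff.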

2. The Export to File API will work with workspaces using

  • A – An A1 Sku (or higher)
  • B – Power BI Pro
  • C – Power BI Free
  • D – A P1 Sku (or higher)

Answer: A, D
Explanation: The Export to File API will export your Power BI reports or paginated reports to the file format of your choice. Based on the capacity backing these requests, you’ll get a higher concurrency rate for processing these reports. In the background, it’s essentially doing the print/save-as operations you can perform manually, but optimised for bulk usage.

Some things to note: right now, you can technically get it to work with a PPU license, but throttling will hit you fast ‘n hard. Whether this behaviour will persist after General Availability on April 2nd, I can’t tell. An A1 SKU (or higher) is intended for external (outside of your own organisation) embedding scenarios, and can help you get started with this scenario really quickly.

Export to File API Explained
Export to File API Reference Page
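The export itself is an asynchronous flow: start the job, poll its status, then download the file. As a sketch (endpoint shapes follow the reference pages linked above; the helper function and the GUID placeholders are hypothetical):

```python
# Hypothetical helper listing the three REST calls in the Export to File flow.
def export_to_file_calls(group_id, report_id, export_id):
    base = f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}/reports/{report_id}"
    return [
        ("POST", f"{base}/ExportTo"),                  # body e.g. {"format": "PDF"}
        ("GET",  f"{base}/exports/{export_id}"),       # poll until the job completes
        ("GET",  f"{base}/exports/{export_id}/file"),  # binary payload of the export
    ]

calls = export_to_file_calls("ws-guid", "rpt-guid", "exp-id")
print([method for method, _ in calls])  # ['POST', 'GET', 'GET']
```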

3. Power BI dataflows can be created/copied through API calls

  • A – TRUE
  • B – FALSE

Answer: A

Explanation: Yes, it’s possible! But it’s not easy 😃. This process is based on the Import APIs, which let you upload .pbix files for a report, or .json files for dataflows. While this sounds easy enough, there are a few hoops to jump through, which I never got working on my own. Luckily, a few community members have posted working options and scripts to do this, and now there’s even an External Tool for it.

Marc Lelijveld – Move dataflows across workspaces
Marcus Wegener (German post, but it translates well) – Export to dataflow

4. To determine if a .pbix file uses DirectQuery through APIs, I can

  • A – use Report Information
  • B – use Dataset Information
  • C – use Datasource Information
  • D – use DirectQuery RefreshSchedule

Answer: B, C, D


Explanation: When using the Dataset Information (GetGroupsAsAdmin, with a $expand on datasets), it will return a field called ‘ContentProviderType’, which displays the connection mode for the .pbix file. For files that run DirectQuery, you’ll see the value ‘PbixInDirectQueryMode’.

When using the Datasource Information (GetDatasourcesAsAdmin), there are 2 fields called Name and ConnectionString. Based on the tests I did on my environments, they only returned values when DirectQuery is being used. Would I completely trust this for all scenarios? Definitely not, but it’s something!

When using the DirectQuery RefreshSchedule, you can call this API for every dataset. The ones that actually return a schedule are either DirectQuery or LiveConnection. Filtering on the DataSourceType (pretty much excluding Power BI / Analysis Services), the returned records are datasets that have the default caching behaviour enabled. Since you have to loop over all your datasets individually, you have a high risk of running into limits on this one, and an additional chance that someone disabled the caching behaviour.

To conclude: the most durable solution I found was using the Dataset Information, but this has not yet been tested with Composite Models v1, or DirectQuery over Azure Analysis Services and Power BI datasets.
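A tiny Python illustration of that Dataset Information approach. The records below are toy data shaped like the dataset objects GetGroupsAsAdmin returns with $expand on datasets; I’m assuming the camelCase field casing of the REST payload here, so check the actual response on your tenant:

```python
# Toy records shaped like expanded dataset objects from GetGroupsAsAdmin.
datasets = [
    {"name": "SalesModel", "contentProviderType": "PbixInImportMode"},
    {"name": "Telemetry",  "contentProviderType": "PbixInDirectQueryMode"},
]

# Keep only datasets whose .pbix runs in DirectQuery mode.
direct_query = [d["name"] for d in datasets
                if d.get("contentProviderType") == "PbixInDirectQueryMode"]
print(direct_query)  # ['Telemetry']
```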

5. “GetScanResult” is the only API that returns Dataset endorsements

  • A – TRUE
  • B – FALSE

Answer: A

Explanation: At the time of writing (and airing), this is definitely the case! GetScanResult is part of the asynchronous mechanism to incrementally fetch all the lineage information on your Power BI tenant. Essentially it’s the same thing that’s happening under the covers of Azure Purview, and this allows you to build your own solution.

To successfully do this, you need to handle the calls for modified workspaces, poll for the request state, and then get the results back. If you’re fairly new to APIs & PowerShell, this is definitely daunting. Luckily, there are community resources out there to help us get started.

Just Thorning Blindbaek – Extracting Power BI Metadata with Azure Data Factory
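The call sequence described above can be sketched as an ordered list. The endpoint paths match the documented scanner (workspace info) APIs; the helper itself, and the placeholder scan id, are my own:

```python
ADMIN = "https://api.powerbi.com/v1.0/myorg/admin/workspaces"

def scanner_call_sequence(scan_id):
    """The asynchronous scanner flow, as ordered (method, url) pairs."""
    return [
        ("GET",  f"{ADMIN}/modifiedWorkspaces"),    # which workspaces changed since last run
        ("POST", f"{ADMIN}/getInfo"),               # start a scan; body lists workspace ids
        ("GET",  f"{ADMIN}/scanStatus/{scan_id}"),  # poll until the scan has succeeded
        ("GET",  f"{ADMIN}/scanResult/{scan_id}"),  # fetch the lineage/metadata payload
    ]

for method, url in scanner_call_sequence("scan-guid"):
    print(method, url)
```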

6. Using a Service Principal (SPN) I can use these read-only Admin APIs

  • A – GetDatasetsAsAdmin (Datasets)
  • B – AddUserAsAdmin (Groups)
  • C – GetDatasourcesAsAdmin (Datasets)
  • D – GetCapacitiesAsAdmin (Capacities)

Answer: C, D

Explanation: Read-only Admin APIs and Service Principal authentication were only announced in December 2020, but they made a really big difference for the type of things we’re doing. They allow us to use an Azure AD App Registration, and extract the metadata details on our Power BI Tenant we need, fully unattended! Right now, there’s only an initial batch of APIs that are allowed for SPN authentication, but I expect more to arrive over time.

The only thing I find a bit wonky right now is that we can call GetDatasourcesAsAdmin, which requires a DatasetID, yet we cannot call GetDatasetsAsAdmin to help us get started. Meaning, if we want to iterate over datasets unattended, we have to call a different API first (such as GetGroupsAsAdmin with $expand datasets, or GetDatasets). Using GetCapacitiesAsAdmin, we can return all the info we need on the capacities we have in our organisation.

AddUserAsAdmin is not allowed, which makes sense as it’s technically not a read-only Admin API.

Microsoft Docs Page that explains Read-only Admin APIs with Service Principal Authentication

7. Using “GetGroupsAsAdmin” I can use the $Expand parameter for

  • A – datasets
  • B – datasources
  • C – users
  • D – apps

Answer: A, C

Explanation: GetGroupsAsAdmin is my bread and butter for getting most details out of my Power BI tenant, especially in combination with $expand. Before this existed, we had to individually loop over all the objects (nested calls, most of the time) to return the required results. For larger tenants, this meant bumping into API limits all .. the .. time .. The GetGroupsAsAdmin API is called once, and will return all the specified information for every workspace in a single go. There’s a limit of 5000 workspaces per call, but you can work your way around this by intelligently looping.

The $expand parameter can be used for users, reports, dashboards, datasets, dataflows, workbooks. Meaning datasources and apps are not a part of this.

GetGroupsAsAdmin with datasets expanded
GetGroupsAsAdmin with users expanded (only for v2 / new workspaces!)
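That “intelligently looping” bit boils down to paging with $top and $skip, which this API supports. A small Python sketch (the helper name and the workspace count are my own; the endpoint and query parameters follow the Docs):

```python
def groups_as_admin_urls(workspace_count, expand=("users", "datasets"), top=5000):
    """Build the paged GetGroupsAsAdmin calls needed to cover all workspaces,
    using $top/$skip to stay under the 5000-workspaces-per-call limit."""
    base = "https://api.powerbi.com/v1.0/myorg/admin/groups"
    return [f"{base}?$expand={','.join(expand)}&$top={top}&$skip={skip}"
            for skip in range(0, workspace_count, top)]

urls = groups_as_admin_urls(12000)
print(len(urls))  # 3 paged calls cover 12000 workspaces
```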

8. I can use the (Datasets) UpdateDatasources API to change a parametrized Connection String

  • A – TRUE
  • B – FALSE

Answer: B

Explanation: We can use API calls to alter connection strings, for instance to change between DEV/PRD environments, or perform a migration. One of those options is the UpdateDatasources API. When using DirectQuery datasources, I’ve had some real issues with a parametrized ConnectionString, as it would not allow it. The Docs page actually does outline this as a restriction, and tells us to use the UpdateParameters API to change the parameters, not the connection string. In full transparency, I’d look at Rebind Report operations, or External Tools, to assist in this process, as I had some kinks to work out when doing this.

9. To safely unassign a Workspace from a Capacity using APIs I can

  • A – Delete the workspace, it can not be unassigned through the API
  • B – Use “CapacityAssignmentStatus” with -UnAssign and Workspace GUID
  • C – Use “UnassignFromCapacity” with the workspace GUID
  • D – Use “AssignToCapacity” with an empty GUID (0000..00) for CapacityID

Answer: D

Explanation: Off the bat, let me emphasise the word “safely” in the question. When deleting a workspace, it’s no longer assigned to the capacity, but we also lose the content that was included in it..

The only real solution is using the AssignToCapacity API with an empty GUID (00000000-0000-0000-0000-000000000000), thus assigning the workspace to a default capacity (the shared one for Power BI Pro). CapacityAssignmentStatus is an actual API, but it’s only used for getting status reports on certain workspaces when making the switch.

UnassignFromCapacity does not exist as an official API, and is something I completely made up 😃
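In code, the safe unassign looks roughly like this. The endpoint and the all-zero capacityId are per the explanation above; the helper function and the placeholder workspace GUID are hypothetical:

```python
EMPTY_GUID = "00000000-0000-0000-0000-000000000000"

def unassign_request(workspace_id):
    """POST body + endpoint for AssignToCapacity that moves a workspace back
    to shared capacity: an all-zero capacityId means 'no dedicated capacity'."""
    url = f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}/AssignToCapacity"
    body = {"capacityId": EMPTY_GUID}
    return url, body

url, body = unassign_request("my-workspace-guid")
print(body)  # {'capacityId': '00000000-0000-0000-0000-000000000000'}
```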

10. To inventarise who CAN use PPU (Premium Per User) features, I have to:

  • A – Check the Power BI Activity Log
  • B – Refer to the PPU Tab in the Power BI Admin Portal
  • C – Grab licensing information (ie. through Graph API)
  • D – Check Power BI Workspace information

Answer: A, C, D

Explanation: First off, I have to explain ‘inventarise’, as it caused some confusion during the quiz. What I mean by it is to assemble a list of users who have the licenses and workspace access to perform PPU activities. To get the license information, we have to look at 2 different sources. The easiest one is the licenses that were assigned by a License or Power BI Admin; these can be extracted through the Graph API, for instance. The hardest one is the In-Product Trial Experience, where users opt in for the paid trial and get a free 60-day access pass to take it for a spin. The only way we can extract that information is as part of the Power BI Activity Log (or Office 365 Audit Log). Specifically, we need to look for the ‘OptInForProTrial’ and ‘OptInForExtendedProTrial’ actions.

Then, we need to look at workspaces assigned to the Premium Per User reserved capacity, which we can do with the GetGroupsAsAdmin API, preferably with the $expand on users, to return the access list for the workspace. Luckily, I made a write-up on this process a while back, which holds some information on what you can do to prevent it, and steps to figure out who’s doing it.

One remark, which was stressed by Sir Saxton of Cubes as well, is the importance of storing the Activity Logs as soon as you can. Due to the limited retention (30 days), you cannot go back in time to when these licenses were made available. This is just one of the use cases where the Activity Log comes to the rescue, and I run into different uses on a daily basis. If there’s anything you should learn from this, it’s to get that extraction and retention up and running, ASAP!

Power BI Premium Per User: Who’s using it in your tenant?
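Once you have the Activity Log stored, finding the trial opt-ins is a simple filter. A Python sketch on toy data (the event records below are made up, and the field names are an assumption about the log’s shape, so verify them against your own extract):

```python
# The two opt-in actions named in the explanation above.
PPU_OPT_IN_ACTIONS = {"OptInForProTrial", "OptInForExtendedProTrial"}

# Toy events shaped like Activity Log records (field names are assumptions).
events = [
    {"UserId": "an@contoso.com", "Activity": "ViewReport"},
    {"UserId": "bo@contoso.com", "Activity": "OptInForProTrial"},
    {"UserId": "cy@contoso.com", "Activity": "OptInForExtendedProTrial"},
]

trial_users = sorted({e["UserId"] for e in events
                      if e["Activity"] in PPU_OPT_IN_ACTIONS})
print(trial_users)  # ['bo@contoso.com', 'cy@contoso.com']
```

Combine this list with the workspace access lists from GetGroupsAsAdmin, and you have your PPU inventory.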


I had lots of fun preparing the questions, and co-hosting the quiz with Just. I’m hoping people have learned something new, and that they’ll dabble in some of these things themselves as well. But why not join us for the real deal next time?
The next Power BI Quiz is on Wednesday April 7th at 8PM UTC+1, with the co-host to be announced.

Subscribe to Just’s YouTube channel, and you’ll get the notifications for it as well



T-SQL Tuesday #135 – My Tools for the Trade

It’s T-SQL Tuesday!!

T-SQL Tuesday #134

T-SQL Tuesday #135

After making my first contribution to T-SQL Tuesday last month, I figured I couldn’t stay behind when it’s a topic I have loads to share on.

T-SQL Tuesday is the brainchild of Adam Machanic (Blog | Twitter). December 2009 was the first T-SQL Tuesday invitation that went out by Adam. It is a monthly blog party on the second Tuesday of each month. Currently, Steve Jones (Blog | Twitter) organises the event and maintains a website with all previous posts. Everyone is welcome to participate in this monthly blog post.

The Ask

This month’s T-SQL Tuesday is hosted by Mikey Bronowski ( Blog | Twitter ). Mikey says: “Without tools, most of the work would be much harder to do or could not be done at all. Write a blog post about the most helpful and effective tools you use or know of.”

The original post is here.

My contribution

Which tools are essential to my working day?

When thinking about which tools I use most often, I quickly drew the conclusion that my job content changed drastically over the past few months. Yes, some spiky microscopic creature causing havoc with a global pandemic naturally contributes to this, but it’s also the nature of projects I’ve been taking on lately. These days, I spend the majority of my time in Microsoft Teams, hopping through tenants & accounts, and jumping from one call/conversation to the other. And yes, I’ve lost many moments of my life figuring out in what channel I was at exactly, and who I was supposed to be talking to.

Microsoft Teams, Edge and Outlook

Bluntly put, Teams has some room for improvement in allowing users to easily switch between tenants and accounts without wreaking havoc on the related Azure ADs. Right now, my best working solution is a Microsoft Edge profile set up for every customer, plus the other types of organisational accounts I have (ie. dataMinds.be). Every single one of these profiles is synced in Edge, and tied into my LastPass vault. Each of those profiles usually has a Teams tenant coupled to it, which I install as an Edge desktop application. It works pretty smoothly for most use cases. The main reason for switching over to the ‘fat client’ of Teams is when I need to take over screens in a call, or go digging deep into settings windows. Marc Lelijveld has written an excellent write-up on how to set this up, although I didn’t go as far as creating custom icons for each single one.

One thing I’ve also started doing for some clients (the ones I work for on a very regular basis), is to set up e-mail forwarding from that customer account to my own work account, with an extra incoming Outlook rule to triage that to a separate folder immediately. It saves me a lot of time when I quickly need to check if something has been sent to that account, rather than jumping through the hoops of VPN, Multi-factor authentication, etc. Most clients remember to include my work account if they want a quick response from me, but it still helps to track down some things that I may otherwise miss.

The Office Suite

When I mentioned my job content changed a lot, it also means that I find myself more in the ‘writing documents’ part of the job. Writing out assessment, audit, governance, presales, .. documents is an important part of what I do these days. So yes, my best friends are Word, Excel, PowerPoint and OneNote. Each has its specific use in what I need to do, and over time I’ve built a whole library of stuff I can reuse. OneNote has definitely grown on me, and it’s now an inevitable part of my process.

And do I do some ‘actual’ work?

Define ‘actual’ work 😉. I still get plenty of opportunities to build out some technical stuff, or help some colleagues/clients when they’re in a jam. For the sake of time, I’ll just limit myself to some of the Power BI things I do, and leave out the rest of the Microsoft Data Platform stack. For once, I’d like to publish a blog post under 3000 words 😊.

When External Tools were released (release blog) to the general audience in July 2020, a shock wave hit the BI landscape. Before, a lot of things were ‘kinda’ possible, but they mostly resulted in venturing onto unsupported terrain, with potentially tricky results. All fun and games when you’re a more technical person trying to keep up with things, but definitely nothing to hand over to a client taking their first steps with Power BI.

The External Tools (and the Enhanced Metadata format enabling it) allow end users of Power BI Desktop to call on custom built applications, scripts, .. to augment their developer/designer experience. These days, there’s over 40 (I stopped counting) external tools available, each with their own use case and focal area. When showing off some of the capabilities to my clients, it amazes me to see how quickly they pick up these things, and start building out their own ways of working.

Depending on the client, their IT compliance rules, and the business and technical requirements, my actual tool belt tends to vary. Not every IT organisation allows users to freely install an application, digitally signed or not, so this is definitely an important one to take into your conversations early on.
In a nutshell, my possible weapons of choice are:

  • Power BI Desktop & SQL Server Management Studio
  • Tabular Editor: My main go-to when I need some more flexibility, for ie. setting up calculation groups, content for documentation, advanced scripting, mass editing objects, ..
    • PowerBI.tips has a 4-part series with Daniel Otykier, the main developer.
  • DAX Studio: Whenever I need to write a more complex DAX measure, or go digging into how DAX measures are performing against my model, DAX Studio comes to my aid. Simply put, this has saved me many hours when figuring out things.
    • PowerBI.tips has a 2-part series with Darren Gosbell and Marco Russo (Part 1, Part 2)
    • SQLBI.com has a full playlist on DAX Studio and Vertipaq Analyzer
  • Vertipaq Analyzer (Excel): Yes, I’m aware DAX Studio holds a version of Vertipaq Analyzer. Yet, the Excel version allows you to go a bit more into detail on encoding specifics etc.
  • ALM Toolkit: When working with Incremental Refresh or larger models, ALM Toolkit allows you to compare and publish metadata changes to the Power BI Service. Deploying datasets without needing to do a full refresh every single time, that’s where the magic is at ♥
    • PowerBI.tips held a webinar with Christian Wade on ALM Toolkit
  • Power BI Helper: Reza Rad has put incredible effort into this tool.
  • Power BI Cleaner: Lightweight solution by Imke Feldmann (The BIccountant) that checks which fields are used in your .pbix file.
  • Power BI Sentinel: PBI Sentinel is a paid tool, but a downright impressive one. The sheer fact that they are able to capture a tremendous amount of information on how Power BI is used in your organisation can be a YUUUGE! timesaver. On top of that, they’re able to perform table and column level lineage for some data sources in Power BI. The fact you can’t even do this (yet) with the tools at hand by Microsoft, is a very impressive feat.
    • Reid Havens held a livestream with Alex Whittles in December 2020, going through some of the most important features.
  • Power BI Field Finder v2: Lightweight solution by Stephanie Bruno to help analyze how a Power BI file is constructed, and how visualisations are used.
  • DAX Beautifier: 1 click, 1 call to DAXFormatter.com to format every piece of DAX code you have in your model. Because yes, my eyes burn when I see poorly formatted code 😂

Which tools am I going to add to my toolbelt? (Soon, I promise ..)

After seeing some folks like Julie Koesmarno (twitter), Aaron Nelson (twitter) and Rob Sewell (twitter) show off Notebooks in Azure Data Studio, I’m keen on rewriting some of my scripts and processes into a fancy Jupyter Notebook. I’m convinced this will really help me in some of the assessments I do, and make it easy to share my work and results. Other than that, it’s fancy toys I want to try out for myself.

Something I’m looking to improve upon fairly soon is my remote whiteboarding setup. In workshop meetings where I had actual people in the same room, I usually ended up at a whiteboard to quickly draw out some things. For me, this is still the hardest thing to adapt to in the remote way of working we have these days. I’m currently digging into some options for external drawing pads that can sync to the Teams Meeting I’m in at that time. Yes, my drawing skills will still be terrible, but it’ll be a huge improvement over what I’m doing now.

I’ve been trying out some options for a to-do list, and I’ve not found one that actually works for me. Currently I’m experimenting with Microsoft To Do, but it’s not catching on as I expected. For some odd reason, I keep ending up back at the pieces of paper that are always lying in front of me at my desk. The physical act of writing it down helps me to remember it best, which has been an issue with some of the tools I’ve tried. Who knows, I may even try to build out an actual kanban board on my wall as a next experiment ..


Writing out my train of thought made me realise I use plenty of different tools, and there’s still quite a few I haven’t touched upon. I’ll need to reflect on how I’m using most of them, and whether it’s to their proposed strengths. If there’s room for improving my process, it’ll definitely be worth it. I’m also looking forward to reading some other posts and seeing what gems other people are using.

Stay safe, take care!


Speaking At : Data Event Vienna 2021 (SQLSaturday #1015)


This Friday, I’m coming out of my hibernation from presenting at remote events, and it’s a special one. The final SQL Saturday as we know it will be held virtually in Vienna, on Friday January 15th 2021. PASS is dissolving that same day, and the future of the SQL Saturday brand is uncertain. SQL Saturday has been a very important part of my community engagements throughout the past years. From attending in Utrecht, to speaking at my first SQL Saturday in Munich, to helping organise our own SQL Saturday Belgium. It’s been one heck of a ride, to say the very least.

I’ll be talking about Impactful Data Visualisations, and some things you can keep in mind when designing them (Session Details). Kicking off at 10:15AM, you can find me in the “Power BI & Power Platform 1” room. Apart from my usual ramblings, there’ll be a stellar lineup with interesting topics to cater to your every need. Registration is free, and open until Friday. Head over to their registration page to join in on the fun!

SQL Saturday Vienna (1015)

Event Link : SQLSaturday #1015 – Vienna 2021 (Remote)
Event Date : Friday January 15th 2021
Session Time : 10:15 – 11:15 (UTC +1 / GMT)
Session : Designing impactful visualisations for your data



T-SQL Tuesday #134 – Give me a break!

It’s T-SQL Tuesday!!

T-SQL Tuesday #134


This is actually the first time I’m contributing to T-SQL Tuesday, after having read many of the entries in the years before.

T-SQL Tuesday is the brainchild of Adam Machanic (Blog | Twitter). December 2009 was the first T-SQL Tuesday invitation that went out by Adam. It is a monthly blog party on the second Tuesday of each month. Currently, Steve Jones (Blog | Twitter) organises the event and maintains a website with all previous posts. Everyone is welcome to participate in this monthly blog post.

The Ask

This month’s T-SQL Tuesday is hosted by James McGillivray ( Blog | Twitter ). James wants to know how we’re managing to give ourselves some breaks, to keep ourselves from going even more bonkers.

The original post is here.

What do you do to take a break when you’re stuck at home?

Arguably, I’ve always been terrible at setting aside time for breaks when working on my own. When working at a customer site, or in the office, things flowed a bit more naturally: grabbing a coffee, having a chat. The reality of the past few months has been that I’ve been stuck behind my desk for hours on end, mostly being dragged into numerous Teams Meetings or Zoom Calls. I quickly realised that I needed to get this under control to be able to last.

These days, my break times are mostly consumed by the doggo of the house, Pixie. She’s a nearly two-year-old Briard, a breed with a heritage as French shepherd dogs. Long story short, these dogs are incredibly active and fun to have around. Most days, I take her for a walk around the block before getting started. Then, during one of my coffee breaks, I go out in the garden and kick around some of the Jolly Balls I bought for her. At times like these, I can tell that she’s got some frustrations of her own to burn off as well.

Playing with Pixie in the garden


But on weekends, when I have some more time on my hands, we usually go out for some bigger walks of up to 10-12 km in the woods near where I live. This gives me some more time to clear my head, and gives the doggo some direly needed attention. Especially with the colder weather and more rain these days, she mostly dives head first into the first puddle she sees, and then continues to do so for the rest of the walk. All good fun, but a long-haired dog takes a looooong time to dry off :).

Walking the dog at Drieboomkesberg


Pixie refusing to come out of a puddle

Pixie discovering some local water

Then, in the evenings, I try to switch between doing some reading or studying for things I’m working on, and simply blowing up stuff in a video game. I’m keeping my sh*t together, but I’m frantically counting down the days until I can go back to my regular activities and get some peace of mind.

In normal times, I’d have Shin Kyokushin Karate training 2-3 times a week, depending on my schedule for that week. It’s a full contact sport, which means it’s physically demanding and exhausting, and during training I’ve never had so much as a stray thought about the stuff I was working on.
Ahhh, to be able to hit some people again 😃

If money was no issue, what would be your bucket list vacation?

I’d have to split it into two, and I’d be torn if I ever had to choose between those two.

I’ve always wanted to go to Japan to visit some of the heritage sites and the Honbu Dojo. Preferably in the same time span as the World Championship, to witness the insane atmosphere in the Tokyo Dome, with 15,000 spectators watching some of the greatest kumite matches our sport has to offer. Followed by a tour of cities like Kyoto, Nagasaki, .. to make it a splendid trip with some of my karate friends.

Then, completely on the other end of the spectrum, there’s one I’m doubtful I’ll ever be able to do. For years, we’ve gotten together with some of my Chiro friends (compare it to Boy Scouts, to a certain extent) for a weekend where we pull the same idiotic stunts we used to do when we were younger. We’re all a bit older now, and many of us are settling into work/life situations. Getting together isn’t as easy as it used to be. For that reason, I’d be thrilled if we were able to get away for a week or so, and go camping with the boys only. Pure nostalgia, tied to some of the fondest memories I have.


I’m managing to keep things afloat, but I’m ready for things to slowly get back to normal. Especially because I’ve not been able to go to practice 2-3 times a week, I’ve felt some built-up frustration that needed to get out.
Here’s to hoping we can all reconvene soon, and get ready for some more epic stories!

Stay safe, take care!


Power BI Premium Per User – Who’s using it in your tenant?

Before digging into this, let’s be clear about one thing. When Power BI Premium Per User (PPU for short) and Power BI Premium v2 were announced at Ignite 2020, the collective user base of Power BI rejoiced, and I am happy to be among them. PPU is going to be an excellent addition to leverage Premium features, without having to pay for an entire P-sku, or deal with spinning up an A-sku. Left, right, and center, we’re seeing interesting new use cases pop up to make the most out of Power BI. And I .. LOVE IT!

I do have some concerns right now, which will likely be taken away as we get closer to General Availability. An obvious one is that as of now (January 2021), there are no public details about the pricing. By default, users can assign themselves a PPU license, and start building things as they see fit. Self-Service BI is great, especially when there are some guard rails in place, and when a Data Culture is actively stimulated. Matthew Roche’s series goes into this in splendid fashion.

The other day, I was chatting with one of my clients about Premium Per User, and I gave them the practical guidance to not build any production level dependencies based on PPU features or workspaces, until some of the unknowns have been cleared up. If there are end users relying on this for their actual daily job, then I’m calling it a production level dependency. Right now, these are preview features, and this client is not actively monitoring changes in the Power BI Landscape.

Shortly after, I got a message that some of their business users did build actual production reports and dataflows in PPU workspaces. And, they were not sure who in the company actually has access to PPU. And that’s where the chase down the rabbit hole began 😃

Who can use PPU Features in my tenant?

By default, every user will be able to assign themselves a PPU trial license, and start experimenting. Users can also be assigned a PPU License through the M365 Admin Center, as outlined in the Power BI Premium per user public preview now available post by Chris Finlan.

This behaviour can be allowed, disabled, or scoped to a specific group of users. By default, it looks as shown in the image below.

Tenant Setting to allow users to try paid features


In combination with the tenant setting for who can create workspaces, this will control the PPU experience, and how freely users can create experiences for themselves. By default, every user in your organisation can create new workspaces, and these will automatically be in the New Workspace Experience (v2 Workspaces).

Tenant Settings for Workspace Creation in your tenant


Okay now, but who’s actually doing something with PPU Features in my tenant?

Attempt 1 – List out PPU Workspaces, get Capacity Information

My first thought was to grab an overview of workspaces that are on Dedicated Capacity using the Power BI PowerShell cmdlets (Get-PowerBIWorkspace & Get-PowerBICapacity), and filter out workspaces that are linked to a PPU Capacity.

Then, for those workspaces, get the Users and/or Groups that have access to them using an Invoke-RestMethod call to the Admin API (GetGroupsAsAdmin, $expand=users).
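As a rough sketch of this filtering logic in Python (the same idea as the PowerShell route, but working on the plain JSON responses): assuming you’ve already pulled the GetGroupsAsAdmin and capacities responses into lists of dicts. The "PP" SKU value I use to identify PPU capacities is an assumption on my end, so verify it against the capacities in your own tenant; the sample data is obviously made up.

```python
def ppu_workspace_users(workspaces, capacities, ppu_sku="PP"):
    """Return {workspace name: [user emails]} for workspaces on a PPU capacity.

    `workspaces` mimics GetGroupsAsAdmin ($expand=users) response items,
    `capacities` mimics Get-PowerBICapacity output. The "PP" SKU value is
    an assumption -- check the actual SKUs in your tenant.
    """
    sku_by_id = {c["id"]: c.get("sku") for c in capacities}
    result = {}
    for ws in workspaces:
        if not ws.get("isOnDedicatedCapacity"):
            continue  # PPU workspaces report as being on dedicated capacity
        if sku_by_id.get(ws.get("capacityId")) != ppu_sku:
            continue  # linked to a P-/A-sku, not PPU
        result[ws["name"]] = [u["emailAddress"] for u in ws.get("users", [])]
    return result


# Hypothetical sample data, shaped like the API responses
capacities = [{"id": "cap1", "sku": "PP"}, {"id": "cap2", "sku": "P1"}]
workspaces = [
    {"name": "PPU Sandbox", "isOnDedicatedCapacity": True, "capacityId": "cap1",
     "users": [{"emailAddress": "an@contoso.com"}]},
    {"name": "Prod Reports", "isOnDedicatedCapacity": True, "capacityId": "cap2",
     "users": [{"emailAddress": "bert@contoso.com"}]},
]
print(ppu_workspace_users(workspaces, capacities))
# {'PPU Sandbox': ['an@contoso.com']}
```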

This works, to a certain extent. A few of the exceptions I’ve thought of so far:

  • Just because a user is in a group that has access, doesn’t mean they actually have a PPU Subscription.
  • A user could have a PPU Subscription, and have left the PPU Workspaces.

Attempt 2 – Get License Information

We can grab the license information through the MS Online cmdlets (Get-MsolUser), the AzureAD cmdlets (Get-AzureADUserLicenseDetail), or the Graph API (https://graph.microsoft.com/v1.0/me/licensedetails).
The MS Online cmdlets don’t support authenticating through Service Principals, so don’t build any dependencies on those.

All these options give me a nice overview of :

  • Power BI Standard (Free) with ServicePlan: BI_AZURE_P0
  • Power BI Pro with ServicePlan : BI_AZURE_P2
  • Power BI Premium Per User with ServicePlan: BI_AZURE_P3
  • Office 365 E5 Subscriptions
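As a minimal sketch, checking a Graph licenseDetails response for the BI_AZURE_P3 service plan from the list above could look like this. The payload below is made up, but shaped like what the /licenseDetails endpoint returns:

```python
def has_service_plan(license_details, plan_name):
    """Check a Graph /licenseDetails-shaped payload for a given service plan."""
    return any(
        plan.get("servicePlanName") == plan_name
        for detail in license_details.get("value", [])
        for plan in detail.get("servicePlans", [])
    )


# Hypothetical payload, shaped like the Graph licenseDetails response
payload = {
    "value": [
        {"skuPartNumber": "PBI_PREMIUM_PER_USER",
         "servicePlans": [{"servicePlanName": "BI_AZURE_P3"}]},
    ]
}
print(has_service_plan(payload, "BI_AZURE_P3"))  # True: PPU assigned
print(has_service_plan(payload, "BI_AZURE_P2"))  # False: no Pro plan
```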

My real concern is that this only lists the users that have been assigned PPU as a (purchased) Product, as part of the “skuPartNumber” PBI_PREMIUM_PER_USER.
Meaning, only the users that were assigned this subscription by an admin will show up this way. Usually, they will be tied to groups as well, which can easily be exported to get an overview.

Right now, I’m still missing the most important group of all: those that have assigned themselves an in-product trial version of Premium Per User from within the Power BI Service.
These are the users that are potentially flying under the radar, and exactly the ones we want to identify.

Attempt 3 – Microsoft 365 Admin Center

After being hinted by Jan Pieter Posthuma to read through the Self-service purchase FAQ | Microsoft Docs, there’s a screen where you can get an overview of paid trials. Alas, this isn’t giving me any of the results I expect to see, and it seems to be tailored specifically to M365 products.
My own user has a trial license from within the Power BI Service, and it also doesn’t show up in the personal overview of Subscriptions & Licenses. Which explains to me why it doesn’t show up in the Admin Center either.

Attempt 4 – Track the Power BI Activity Log

Similar to the “OptInForProTrial” and “OptInForExtendedProTrial” activities that appear in the Power BI Activity Log when users assign themselves an in-product trial Power BI Pro Subscription, I was hoping to see the same for Power BI Premium Per User subscriptions. I’ve been using this method to track users getting a Pro Trial, who will potentially need an actual paid license within 60 days.
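A small sketch of that tracking step: grouping activity log events by user for the trial opt-in activities. The field names ("Activity", "UserId", "CreationTime") follow the shape of the activity log events as I’ve seen them exported; the sample events are made up:

```python
from collections import defaultdict

def trial_opt_ins(events,
                  activities=("OptInForProTrial", "OptInForExtendedProTrial")):
    """Group activity log events by user for the in-product trial activities.

    `events` is a list of dicts shaped like Power BI activity log entries
    (e.g. as exported via Get-PowerBIActivityEvent).
    """
    by_user = defaultdict(list)
    for event in events:
        if event.get("Activity") in activities:
            by_user[event["UserId"]].append(event.get("CreationTime"))
    return dict(by_user)


# Hypothetical sample events
events = [
    {"Activity": "OptInForProTrial", "UserId": "an@contoso.com",
     "CreationTime": "2021-01-12T09:00:00"},
    {"Activity": "ViewReport", "UserId": "bert@contoso.com",
     "CreationTime": "2021-01-12T09:05:00"},
]
print(trial_opt_ins(events))
# {'an@contoso.com': ['2021-01-12T09:00:00']}
```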

I assigned a test user an in-product trial subscription, and grabbed the activity logs the day after. This activity actually also shows up as “OptInForProTrial”, which made sense as soon as I actually read through the post from Chris Finlan again, specifically the How To Get Started, Existing Users section.

Existing Free users – You will be given access to the PPU capabilities during the public preview period if you choose to opt-in to a Pro trial as an added benefit.  Since it will happen behind the scenes, the UI in the portal will still reflect that you have a Pro trial, but you will be able to access all the PPU features.

For the set of customers that have disabled their in-product trial experience, your Microsoft 365 tenant admin can opt-in to a trial experience there.

Going out on a limb, I’m assuming the in-product trial functions the same way for both Pro and PPU subscriptions, and that this is why there’s no distinction possible in the User Interface or Activity Logs. Likely we’ll see some changes made in the future, but I’ve no guarantees to back this up.

Bringing the attempts together

To circle back to an exhaustive overview, I’m rolling with a combination of Attempts 1, 2, and 4.

  • Getting the actual licensing information will give us the details on the users that have been assigned a license by an admin. (Attempt 2)
  • Monitor the Activity Log for “OptInForProTrial” activities. (Attempt 4)

Right now the trial experience is the same for Pro and PPU Subscriptions, which means this is about as exhaustive as the list is going to get. If you don’t have an extraction set up for Activity Logs, I suggest you set one up as soon as possible.

Then, having identified the PPU Workspaces in Attempt 1, we can specifically track these in the Activity Log. If we’re seeing certain users perform specific activities in there, this could indicate there’s usage going on beyond testing and development.
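Pulling the attempts together boils down to a simple union of the user sets from each one, something like the sketch below (with made-up user names):

```python
def ppu_candidates(licensed_users, trial_users, ppu_workspace_members):
    """Union the three attempts: admin-assigned PPU licenses (Attempt 2),
    in-product trial opt-ins (Attempt 4), and members of PPU workspaces
    (Attempt 1). Anyone in this set deserves a closer look."""
    return set(licensed_users) | set(trial_users) | set(ppu_workspace_members)


print(sorted(ppu_candidates(
    ["an@contoso.com"],        # assigned a PPU license by an admin
    ["bert@contoso.com"],      # opted into an in-product trial
    ["an@contoso.com"])))      # member of a PPU workspace
# ['an@contoso.com', 'bert@contoso.com']
```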


When I got this question, I underestimated the steps needed to get a complete answer, which is why I thought it could prove useful for other people. My biggest learning here was that free users can actually assign themselves a PPU Subscription, but that it looks like a Pro Subscription in the user interface. Definitely something to keep in mind when determining your licensing strategy for Power BI.

Most importantly, this outlines once more why grabbing and storing the Power BI Activity Logs as soon as possible is crucial to understanding the usage in your tenant, and being able to act on that. Looking for an example on how to extract these logs? There are some examples on GitHub by Melissa Coates, Alex Powers, or Aaron Nelson which can easily help you get started.

The hardest part to track are the in-product trials, and there are varying opinions on these. I’ve seen organisations disable the “Allow users to try paid features” tenant setting, so that users always have to be assigned a subscription, meaning licenses have to be available as well.
Additionally, you want to make sure your corporate workflows are fluid and fast enough to process incoming requests for subscriptions.

It’s all about walking the fine line between Self-Service BI and Managed Corporate BI, and finding the strengths your organisation can play to.

Thanks for reading!
