Thursday, December 15, 2016

Easily add authorization rules to your Azure App

In the last post we saw how you can configure authentication for your Azure app using Google.  You can configure authentication using other providers too, like Facebook, Twitter, or Microsoft Azure Active Directory.  And the good thing is that you can do this without modifying your deployed application.

In this post I want to show you a new Azure App Service feature called Authorization, which you can configure simply by adding an authorization.json file to your application's root directory.  Let's take a look at this authorization.json file.  For my simple example I created it as follows:

{
  "routes": [
    {
      "path_prefix": "/",
      "policies": { "unauthenticated_action": "AllowAnonymous" }
    },
    {
      "path_prefix": "/Admin",
      "policies": { "unauthenticated_action": "RedirectToLoginPage" }
    }
  ]
}

First you have a routes property, which is a collection of URL authorization rules.  Each rule has several properties.  In the above example, the first rule has path_prefix set to "/" and the "unauthenticated_action" policy set to "AllowAnonymous".  This means anonymous access is allowed when I visit the home page.  But here is a caveat: this only works if the "Action to take when request is not authenticated" option in the Azure portal is set to "Allow Anonymous requests (no action)".  It was a bit confusing to me because I had that option set to "Log in with Google" and the .json file would not allow anonymous access.  Check out the screenshot of my settings.

image

The next authorization rule in the above JSON says that when I navigate to the /Admin page, redirect me to the login page, which in my example is Google's login page.

The basic schema of this .json file is as follows, as shown in this MSDN article:

{
  "routes": [
    {
      "http_methods": [ "GET", "POST", "PUT", ... ],
      "path_prefix": "/some/url/prefix",
      "policies": {
        "unauthenticated_action": "AllowAnonymous|RedirectToLoginPage|RejectWith401|RejectWith404"
      }
    },
    ...
  ]
}

You might be wondering what the "http_methods" property does.  It says that the rule only takes effect for this URL when the request uses one of the listed methods, such as "GET", "PUT", or "POST".  If you want to restrict access to a particular page only when it is accessed via a "GET" request, then you can specify just "GET".
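
For example, a hypothetical rule (following the same schema) that rejects unauthenticated POST requests under /api while leaving other verbs alone could look like this:

{
  "http_methods": [ "POST" ],
  "path_prefix": "/api",
  "policies": { "unauthenticated_action": "RejectWith401" }
}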

The path prefix can contain wildcards too, which is very important if the URL has variable segments. For example, for /product/1/edit you can set path_prefix to /product/*/edit, as in the sketch below.  For simple web applications this feature works great, and I like what a turnkey solution it is.
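
A sketch of such a rule, protecting the edit page of every product:

{
  "path_prefix": "/product/*/edit",
  "policies": { "unauthenticated_action": "RedirectToLoginPage" }
}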

Once again for this feature to work the authorization.json file has to be at the root of the application folder.

Tuesday, November 29, 2016

Troubleshooting TF14098 permission denied while deleting a branch

Recently I encountered the error TF14098: Access Denied: User DOMAIN\user needs Checkin permission(s) for $/teamproject/folder/* while deleting a TFS branch.  In this post, I want to share my experience troubleshooting this issue.  I am using TFS 2015 hosted on-premises with the latest update, and I am a TFS admin with project collection administrator rights.

When I first saw this error, I felt I knew why it happened.  We had two branches, DEV and TEST; TEST was branched from DEV.  DEV had a folder called Binaries, and in it there was a particular .dll which had explicit Deny check-in permissions set for [Project]\Contributors, [Project]\Project Administrators, and [Collection]\Project Collection Administrators. I thought that since TEST was branched from DEV, all those permissions would have been in effect in TEST.  I right-clicked on TEST/Binaries/our.dll from within the TFS Source Control Explorer, then clicked on Advanced and then on Security.

image

After that I made sure those three permissions were set to Allow and nothing was set to Deny.  I was confident this would work, but it did not. Again, the same error: Access Denied.

Next I opened the Visual Studio developer command prompt in administrator mode, navigated to the actual local workspace folder, and ran the command tf permission (or tf vc permission) as follows:

>tf vc permission $/project/test /recursive | clip

This lists all the permissions on files in your workspace recursively and copies the output to your clipboard, which you can then paste into Notepad and search for "Deny". I found the groups that had Deny permissions set for that particular file.  I double-checked that file's permissions inside the TFS Source Control Explorer, thinking something might have gone wrong.  I tried deleting the branch again. Same error. Then I deleted just that file to get rid of the permission error.  This time I was super confident this would fix the issue, since the file no longer existed locally or on the server.  I tried again. Same error.  Then I read in this MSDN forum that permissions can exist on a path even if no file exists on the server.  This makes sense if you think about it: the path can still exist on the server even if there is no file in a branch, since a branch only has pointers to its contents.  I used the following command to set Allow on that file:

>tf vc permission $/project/test/binaries/our.dll /allow:* /group:"[Project]\Contributors","[Project]\Project Administrators"
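
As an aside, if you would rather clear the explicit entries entirely instead of flipping them to Allow, tf vc permission also has a /remove option. A hedged sketch (I did not try this exact combination, so check tf vc permission /? first):

>tf vc permission $/project/test/binaries/our.dll /remove:* /group:"[Project]\Contributors","[Project]\Project Administrators"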

I tried deleting the branch again inside Visual Studio, then tried checking in again, and this time it finally worked.  I hope this helps.

Thursday, November 24, 2016

Adding Google authentication to Azure App Service web app

In this post, I want to show you how quickly you can enable authentication using Google for a web app hosted in Azure App Service. You can configure other authentication providers too if you wish. The documentation for this feature can be found here.

First of all, create a blank ASP.NET web app and publish it to Azure. It can be any web application; for demonstration purposes I created a .NET Core web app and hosted it on Azure App Service.  Open the App Services blade, open your website's blade, and click on Settings.  Under Settings you will see the Authentication / Authorization option.  Turn on the authentication feature as shown below.

image

After you turn on authentication, for "Action to take when request is not authenticated" select "Log in with Google".

image

You can see the different authentication providers in the list above. Click on Google.  On the next screen it will ask you for the Google authentication settings, i.e. a Client ID and Client Secret.

image

Let's head over to console.developers.google.com.  You should land on the Dashboard page; click on Enable API, then click on the Google+ API to enable that API. Your screen should show the Google+ API like the screenshot below.

image

After you have enabled the Google+ API, go to the Credentials page, click on the Create credentials button, and click on OAuth client ID.

image

On the Create client ID page, select Web application.

image

As soon as you click on Create you will be asked to provide the following information.  Let's go over these in detail.

1. Name (provide any name you wish, I used the name of the web app)

2. Under Restrictions – Authorized JavaScript origins, you provide the URLs you want protected, so that when someone hits one of those URLs they are presented with Google's login dialog. You cannot use wildcards in the URL. My URL is http://[webappname].azurewebsites.net

3. Under Restrictions – Authorized redirect URIs, you have to provide a redirect URL.  What is a redirect URL? After you are successfully authenticated, Google doesn't know where it should send you, so you have to provide a redirect URL for your website. You cannot provide just any URL; it has to be https://[webappname].azurewebsites.net/.auth/login/google/callback.

image

After you save you will be provided with a Client ID and Client Secret. Copy them and paste them in the Azure Portal and save.

image

Navigate to your website and you will be presented with the Google login option.  After you are successfully logged in, you will be redirected to your website's home page.  I think it is very easy to set up, but real-world apps need more than a one-size-fits-all approach.  For example, what if I wanted anonymous access to the home page but authenticated access to the rest of the application?

You might have questions like, "How do I prevent anybody with a Google account from accessing my app?" That piece is up to you to configure in your application.  "How do I know the email of the user and which login provider they used?" These are provided to you as HTTP headers, as follows:

X-MS-CLIENT-PRINCIPAL-NAME : youremail@gmail.com

X-MS-CLIENT-PRINCIPAL-IDP: google

In ASP.NET Core you can access them as follows:

public IActionResult Index()
{
    // Easy Auth passes the authenticated user's name (here, the email) as a header.
    var user = Request.Headers["X-MS-CLIENT-PRINCIPAL-NAME"].ToString();
    if (user == "youremail@gmail.com")
    {
        return View("Index"); // authorized view
    }
    else
    {
        return View("Error"); // unauthorized view
    }
}

If you want to see more information returned from Google, you can access your site by typing the following URL:

https://[webappname].azurewebsites.net/.auth/me

This will return a json object with all the information.

Inside an ASP.NET 4.6 application you can access this information via the already-populated claims, as sketched below.
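
Here is a minimal sketch of what that could look like, assuming an MVC 5 controller and that App Service authentication has populated the claims for the request:

using System.Security.Claims;
using System.Web.Mvc;

public class HomeController : Controller
{
    public ActionResult Index()
    {
        // Easy Auth populates the current claims principal for authenticated requests.
        var principal = ClaimsPrincipal.Current;
        var name = principal?.Identity?.Name; // e.g. youremail@gmail.com for Google logins
        return Content("Logged in as: " + name);
    }
}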

Friday, November 18, 2016

How to read Google SpreadSheet using Sheets API v4, .Net and a Service Account

In this post, I want to show how to read Google spreadsheet data using the Google Sheets v4 APIs in .NET via a service account.  First of all you need an active Google account; next, head to the Google developer console and create a project. I created a project called My Project. Then, on the dashboard screen, click on Enable API.

image

On the following screen click on the Google Sheets API link and the Sheets API will be enabled. Next go to the Credentials page and then click on Create Credentials.

image

Click on the third option Service account key to create a service account.

image

On the Create service account key screen, click on New Service Account. Provide a service account name and select a role for your project; I am choosing Owner here. Select JSON as the key type. Finally click on Create and the file will be downloaded to your machine.  Note where this file is downloaded; we will need it in a later step.

image

On the Credentials page, under service account keys, you will be able to see the account you created in the earlier step.

image

Create a spreadsheet called Employees with one row and two columns, and keep a note of the spreadsheet ID.  Note that you will also need to share the spreadsheet with the service account's email address (the client_email value in the downloaded JSON file); otherwise the service account will not be able to read it.

image

The spreadsheetId can be found in the URL of the Google spreadsheet, as shown below; it is the part inside the {} brackets. Keep a note of this spreadsheet ID, we will need it in a later step.

https://docs.google.com/spreadsheets/d/{your-spreadsheet-id}/edit#gid=0
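
If you ever want to grab the ID programmatically, a hypothetical one-liner (not part of the sample project) can slice it out of the URL:

// Take everything between "/d/" and the next "/".
var url = "https://docs.google.com/spreadsheets/d/{your-spreadsheet-id}/edit#gid=0";
var spreadsheetId = url.Split(new[] { "/d/" }, StringSplitOptions.None)[1].Split('/')[0];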

Finally, all the setup is done. Let's head over to Visual Studio and create a new console project.  Install the NuGet package Google.Apis.Sheets.v4 from the NuGet package manager. Next create three .cs classes and paste in the code shown below, then fix all the references.

1.   GoogleService.cs. This class is responsible for creating a SheetsService using a GoogleCredential.

 
using System.IO;
using Google.Apis.Auth.OAuth2;
using Google.Apis.Services;
using Google.Apis.Sheets.v4;

public class GoogleService
{
    private readonly string _googleSecretJsonFilePath;
    private readonly string _applicationName;
    private readonly string[] _scopes;

    public GoogleService(string googleSecretJsonFilePath, string applicationName, string[] scopes)
    {
        _googleSecretJsonFilePath = googleSecretJsonFilePath;
        _applicationName = applicationName;
        _scopes = scopes;
    }

    public GoogleCredential GetGoogleCredential()
    {
        // Load the service account credential from the downloaded JSON key file
        // and restrict it to the requested scopes.
        GoogleCredential credential;
        using (var stream = new FileStream(_googleSecretJsonFilePath, FileMode.Open, FileAccess.Read))
        {
            credential = GoogleCredential.FromStream(stream).CreateScoped(_scopes);
        }
        return credential;
    }

    public SheetsService GetSheetsService()
    {
        var credential = GetGoogleCredential();
        var sheetsService = new SheetsService(new BaseClientService.Initializer()
        {
            HttpClientInitializer = credential,
            ApplicationName = _applicationName,
        });
        return sheetsService;
    }
}

2.   SpreadSheet.cs. I created a spreadsheet class to store the values we read from the spreadsheet. There is a HeaderRow and a list of Rows, and each SpreadSheetRow can hold multiple cell values.  For demo purposes I only expose the first two columns, since that is all we need.

 
using System;
using System.Collections.Generic;

public class SpreadSheet
{
    public SpreadSheetRow HeaderRow { get; set; }
    public List<SpreadSheetRow> Rows { get; set; }
}

public class SpreadSheetRow
{
    private readonly IList<object> _values;

    public SpreadSheetRow(IList<object> values)
    {
        _values = values;
    }

    public string Value0 => _getValue(0);

    public string Value1 => _getValue(1);

    private string _getValue(int columnIndex)
    {
        try
        {
            // The API returns ragged rows, so a column may be missing entirely.
            return _values[columnIndex].ToString();
        }
        catch (Exception)
        {
            return string.Empty;
        }
    }
}

3. GoogleSpreadSheetReader.cs. This class relies on the GoogleService class from step 1.  The GetSpreadSheet method accepts a spreadSheetId parameter and a range parameter.  My spreadsheet's first row is the header row; if you want to treat every row as data, you will have to modify this method.

using System;
using System.Collections.Generic;
using Google.Apis.Sheets.v4;
using Google.Apis.Sheets.v4.Data;

public class GoogleSpreadSheetReader
{
    private readonly SheetsService _sheetService;

    public GoogleSpreadSheetReader(GoogleService googleService)
    {
        _sheetService = googleService.GetSheetsService();
    }

    public SpreadSheet GetSpreadSheet(string spreadSheetId, string range)
    {
        SpreadsheetsResource.ValuesResource.GetRequest request =
            _sheetService.Spreadsheets.Values.Get(spreadSheetId, range);

        ValueRange response = request.Execute();
        IList<IList<object>> values = response.Values;

        // Row 0 is the header; everything after it is data.
        var rows = new List<SpreadSheetRow>();
        for (int i = 1; i < values.Count; i++)
        {
            rows.Add(new SpreadSheetRow(values[i]));
        }

        return new SpreadSheet
        {
            HeaderRow = new SpreadSheetRow(values[0]),
            Rows = rows,
        };
    }
}

4. Remember the JSON file we downloaded from Google? Put that file inside a folder called GoogleSecret in your solution as shown below.  Right-click on the file to view its properties (or hit F4) and change Copy to Output Directory to Copy Always.

image

image

5. Putting it all together inside Program.cs to finally read data from the Google spreadsheet.

using System;
using Google.Apis.Sheets.v4;
using Newtonsoft.Json;

class Program
{
    static void Main(string[] args)
    {
        Console.WriteLine(AppDomain.CurrentDomain.BaseDirectory);

        // The service account key file is copied to the output directory on build.
        var googleSecretJsonFilePath = $"{AppDomain.CurrentDomain.BaseDirectory}\\GoogleSecret\\GoogleSecret.json";
        var applicationName = "My Project";
        string[] scopes = { SheetsService.Scope.SpreadsheetsReadonly };

        var googleService = new GoogleService(googleSecretJsonFilePath, applicationName, scopes);

        var spreadSheetId = "your-spreadsheet-id";
        var range = "A:B";

        var reader = new GoogleSpreadSheetReader(googleService);
        var spreadSheet = reader.GetSpreadSheet(spreadSheetId, range);

        Console.WriteLine(JsonConvert.SerializeObject(spreadSheet.Rows));
        Console.Read();
    }
}

Explanation of the above code: first we build the path to the JSON key file under the bin directory, then provide the name of the application; I called mine "My Project".  Using scopes, we declare what level of access we need, e.g. SpreadsheetsReadonly.  We create a new instance of GoogleService and pass it into GoogleSpreadSheetReader.  The Employees spreadsheet uses only two columns, so we provide the range A:B.  The range option is very interesting and you can get quite creative with these ranges; a few examples are sketched below. Finally you get a spreadsheet by calling reader.GetSpreadSheet(spreadSheetId, range). Run the application by pressing F5 and you will see the output as a JSON string.
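
To give a sense of what else the range parameter accepts, here are a few strings in standard A1 notation you could pass instead of "A:B" (the variable names are just for illustration):

var wholeColumns = "A:B";       // every row in columns A and B (what we used)
var block = "Sheet1!A1:B2";     // a 2x2 block on the sheet named Sheet1
var oneColumn = "Sheet1!A:A";   // all of column A on Sheet1
var oneRow = "2:2";             // all of row 2 on the first sheet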

If you have any questions then please let me know in the comments below.

Monday, November 7, 2016

Compare TFS 2015 Build definition changes

Today I discovered a feature that I feel is quite useful.  I always wanted to know what changes were made to a build definition: if someone changed a build definition, you would want to know what changed. If you are using TFS 2015 or Visual Studio Team Services, go to Build, right-click on a build definition, and click Edit build definition. Under History, click on an entry to see the diff button highlighted.

image

Once you click on that, you will see all the changes made to the build definition since the last time it was changed. Enjoy.

Sunday, October 30, 2016

Azure ARM template tips and tricks

In this post, I want to share some of the things I have learnt while authoring and debugging ARM templates.  I am assuming you know what Azure ARM template deployment is; if you want to learn about ARM templates, go check this article about ARM. If you have any tips of your own, please share them in the comments section below.

1. Azure ARM Quickstart Templates

If you are having issues with ARM templates, then before you go crazy searching all over the internet I recommend you take a look at the Azure Quickstart Templates. For any resource type, first check whether you can create a simple resource of that type as provided in the quickstart samples, and compare the properties/attributes you used against those in the sample template. I learnt a lot from these quickstarts.

2. Authoring Templates via Visual Studio.

There are many ways to create ARM templates, and I am not going to cover them all here, but one of them is Visual Studio.  I am using Visual Studio 2015 Enterprise edition, but Community edition will also work; you need the latest Azure SDK installed for this to work.  You might ask, "Why do I need Visual Studio when I can create a .json file in any text editor?"  While you can create an ARM template with any text editor, knowing all the resource types and all their attributes/properties is not trivial, and you might spend hours guessing what is or isn't supported.  VS handles this pretty nicely, and I want to show you the wonderful support built into it.

i. Go to File | Create New Project | Under C# Templates | Cloud | Select Azure Resource Group

image

ii. You can select a preconfigured template or a blank template.  For experimentation and learning you can select one of the preconfigured templates to see how an ARM template is written. In this step, select Blank Template and click OK.

image

After you click OK you will see the azuredeploy.json and azuredeploy.parameters.json files created, as seen in the Solution Explorer window. The JSON Outline window is shown next to the Solution Explorer window.

image

As you click on any node in the JSON Outline window, the relevant sections of the azuredeploy.json file are highlighted on the right-hand side.

image

Adding resources is very easy. Just right-click on the resources node and click Add New Resource.

image

In the next window you will see all the resources you can select, and you can add multiple resources from this window. New resource types are not listed here, but many common and frequently used ones are.

image

I added three resources (an app server farm, a web app, and a SQL server), and in the azuredeploy.json file it created some complex-looking JSON. Again, as you click on each resource in the outline window, the relevant resource is highlighted in the JSON file, and you can remove a resource if you don't want it.  I think this is a very easy way to create Azure ARM templates. If there are dependent resources, you will also see them populated in the above window.

image

Now I want to share some of the gotchas with Visual Studio while Authoring ARM templates.

a.  Not all resources are listed in the Add Resource window.  If you want to add a resource that is not listed there, you will have to go to the quickstart templates GitHub repo, find it there, and add it manually.

b. Visual Studio gives you red squiggly lines when you make a mistake while authoring ARM templates. This is both good and bad; here are my two cents. I ran into an issue while creating multiple Azure storage accounts of different types and kept messing up my template.  VS was producing warnings, but they didn't help me figure out what the problem was: the error messages are very confusing and don't point to the real issues in your template.  Sometimes the template shows red squiggly lines yet still deploys successfully.

3. Deploying using Visual Studio

Deploying ARM templates via Visual Studio is very easy. Just right-click on the project and click Deploy. A dialog will appear asking you to select or create an Azure resource group, and you can edit parameters from this dialog. One thing I found very useful was validating the template before deploying it.
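
If you prefer to validate from PowerShell instead, the AzureRM module has a cmdlet for exactly this; a quick sketch, where the resource group and file paths are placeholders:

# Validate the template and parameters against a resource group without deploying anything.
Test-AzureRmResourceGroupDeployment -ResourceGroupName "my-resource-group" `
    -TemplateFile ".\azuredeploy.json" `
    -TemplateParameterFile ".\azuredeploy.parameters.json"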

4. Authoring Templates via Visual Studio Code

Read that again: it is Visual Studio Code, not Visual Studio 2015. You can author ARM templates in Visual Studio Code since it is a text editor, but VS Code lights up once you install the Azure Resource Manager Tools extension and have an ARM template open. If there are mistakes it will highlight them just like Visual Studio, and the same issues persist with this extension: it is not easy to figure out those errors. For example, take the apiVersion (more on that below) for the Storage resource type. If I use an older API version [2015-08-01] but use newer attributes [sku, kind], it will not say that these attributes are unsupported; instead the error is "value is not an accepted value", which is correct but confusing.

image

5. Learning more about the template schema

Both Visual Studio and Visual Studio Code rely on the schema file to validate your JSON template. If you make a mistake authoring a resource, then knowing more about how the template schema works will take you a long way.  Here are a few of my tips on this JSON template.

a.  ApiVersion is a thing not to be taken lightly.

Azure ARM resources are managed by REST APIs, and as new functionality is added to these APIs their apiVersion numbers change. Knowing which apiVersion to use for a particular resource type is very important; otherwise you will spend countless hours trying to debug issues. I recommend this article if you are interested in knowing more. But here is a quick PowerShell snippet to help you find all supported apiVersions for the Microsoft.Storage provider; replace the provider name with the one you need.

((Get-AzureRMResourceProvider -ProviderNamespace Microsoft.Storage).ResourceTypes | where-object ResourceTypeName -eq StorageAccounts).ApiVersions

If you start using newer properties/attributes for a resource but keep using an older apiVersion number, it will not work. That brings us to the next tip: how do you know which properties are supported in a particular apiVersion?

b. Knowing all the properties/attributes supported by a resource type for a given apiVersion.

Let's take the Storage resource provider and find out which properties are supported for different apiVersions. First of all, every .json template begins with a $schema attribute.

"$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#"

Paste that URL in your browser and it will download a deploymentTemplate.json file. Open it and search for Storage.

image

"$ref": "http://schema.management.azure.com/schemas/2015-08-01/Microsoft.Storage.json#/resourceDefinitions/storageAccounts"

"$ref": "http://schema.management.azure.com/schemas/2016-01-01/Microsoft.Storage.json#/resourceDefinitions/storageAccounts"

You can see that two different API versions of Microsoft.Storage are referenced.  Download each of these files and open them.

Inside the 2015-08-01 file, navigate to the required section. You will see the following.

image

The required node has only four properties listed; you should definitely include them.

Inside the 2016-01-01 file, navigate to the required section and you will see the following.

image

For the 2016-01-01 template, sku and kind are required; hence, if you provide sku and kind in a 2015-08-01 template, it will not work. A minimal example of the newer shape follows.
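
For example, a minimal storage account resource targeting the 2016-01-01 apiVersion would look something like this (the account name is a placeholder):

{
  "type": "Microsoft.Storage/storageAccounts",
  "name": "mystorageaccount001",
  "apiVersion": "2016-01-01",
  "location": "[resourceGroup().location]",
  "sku": { "name": "Standard_LRS" },
  "kind": "Storage",
  "properties": {}
}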

c. Function/Expressions debugging.

You can do some very complex deployments using just these .json templates because they support many functions; read more about template functions here. Sometimes I evaluate a complex expression by creating a new property inside the outputs section and deploying the template with no resources in it, as in the sketch below.
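
Here is a minimal sketch of such a template: no resources at all, just an outputs property that evaluates the expression being debugged (the expression here is only an example):

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "siteName": { "type": "string", "defaultValue": "mysite" }
  },
  "resources": [ ],
  "outputs": {
    "debugValue": {
      "type": "string",
      "value": "[concat(parameters('siteName'), '-', uniqueString(resourceGroup().id))]"
    }
  }
}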

image

And in the output window I see the result.

image

You can evaluate some very complex expressions. I wish there was a way you could validate those expressions. 

If you know more tips and tricks with Azure ARM template deployment/debugging please share with me in the comments section.

Sunday, May 1, 2016

Some thoughts on putting JavaScript, CSS, Images inside wwwroot folder

Last week, our team spent time converting JavaScript to TypeScript and CSS to LESS, and restructuring our client-side developer workflow.  In this post, I just want to focus on the wwwroot folder.  We do our web development using ASP.NET MVC, and there is still a lot of code in Web Forms too. We wanted to bring sanity to our growing CSS and JavaScript code as we do more and more client-side work.  So we took some cues from how ASP.NET Core organizes code and followed basically the same approach: put everything inside a wwwroot folder, like below.

wwwroot
  css
  js
  lib

We had a discussion amongst ourselves, and here are our reasons for sticking with it. I know the ASP.NET Core team must have had their own reasons, and ours may well match what they were thinking.

1. Everything client side is inside one folder.

Imagine you are working inside a typical ASP.NET MVC web app: you have CSS inside the Content folder, JavaScript inside the Scripts folder, and images in the Images folder, and these folders are spread apart. If you have a lot of resources, you may spend time scrolling around looking for a particular file. [Tip: if you use VS Code, Ctrl+P can help you open a file very quickly.]

2. wwwroot starts with w.

Alphabetically, wwwroot is at the end of the folder list, so you don't have to go looking for stuff.  You know it is at the bottom and everything is in there. We thought of names like web and public, but we stuck with wwwroot for now.

If you find any great discussion on this topic please share with me in the comments below.

Saturday, April 30, 2016

Use Get-CommandVariable to auto generate variables for a command

In this post, I want to share a silly new PowerShell command, Get-CommandVariable, that I wrote. It is available on GitHub as a PowerShell module; you can put it inside your Modules folder and start using it.  "What problem is this trying to solve?" you might be thinking.

You start writing a PowerShell script and you want to pass variables to a command.  For example, you want to create a new AzureRM web app using the command New-AzureRMWebApp. Then you have to write your command like this, using $variables:

$ResourceGroupName = "mycustomresourcegroup"
$Name = "webapp01"
$Location = "centralus"

New-AzureRMWebApp  -ResourceGroupName $ResourceGroupName -Name $Name -Location $Location

Let's understand what you had to go through: type all the parameters and their variables with a $ sign, then copy those variables to the top so you can initialize them.  That's not even the tricky part; the thing that gets me the most is trying to think of names for those variables and then copying them to the top.

So Get-CommandVariable will create all of this for you.  How?  Just specify which command you want to generate automatic $variables for, and it will do it. In the example below, we use the New-AzureRMWebApp command; after it generates the output, I just copy the text and paste it into an ISE window.

PS C:\>Get-CommandVariable -CommandName New-AzureRMWebApp
$ResourceGroupName = ""
$Name = ""
$Location = ""
New-AzureRMWebApp -ResourceGroupName $ResourceGroupName -Name $Name -Location $Location

If you have been using PowerShell for a long time, you may know that commands have certain parameters that are mandatory and others that are not.  Then there are commands that work with different combinations of parameters. Get-CommandVariable works in those scenarios as well.

For example, if you provide the -ShowAll option it will generate variables for all the parameters, not just the mandatory ones.

PS C:\>Get-CommandVariable -CommandName New-AzureStorageAccount -ShowAll
$StorageAccountName = ""
$Label = ""
$Description = ""
$AffinityGroup = ""
$Type = ""
$Profile = ""

New-AzureStorageAccount  -StorageAccountName $StorageAccountName -Label $Label -Description $Description -AffinityGroup $AffinityGroup -Type $Type -Profile $Profile

You can also list all the parameter sets (think: different combinations of parameters) that a particular command expects. For example:

PS C:\>Get-CommandVariable -CommandName New-AzureStorageAccount -ListParameterSets
Name
----
ParameterSetAffinityGroup
ParameterSetLocation

Then you can tell Get-CommandVariable to generate variables for a particular parameter set. In the example below we are choosing ParameterSetAffinityGroup.

PS C:\>Get-CommandVariable -CommandName New-AzureStorageAccount -ParameterSetName ParameterSetAffinityGroup
$StorageAccountName = ""
$AffinityGroup = ""

New-AzureStorageAccount  -StorageAccountName $StorageAccountName -AffinityGroup $AffinityGroup
By default, I show just the mandatory parameters, since my goal is to get you going as quickly as possible; to show all parameters you have to provide the -ShowAll flag. So that's it; if you think this might be useful to you, give it a try. You can provide feedback in the comments below or submit issues directly on GitHub.

Sunday, April 3, 2016

PowerShell Tip: Start-Transcript and Stop-Transcript

You are trying to write a PowerShell script, but you don't know all the right commands to execute and what parameters to pass.  So you write a bunch of commands, out of which many don't work and some do. After lots of experimentation you finally find the right commands, with the right parameters, that work for your script. You can run Get-History to get a list of all the commands that were executed, but that history only persists as long as you have the PowerShell window open; once you close the window, that history is gone. I have closed the console window many times and found myself cursing for having done so, because I couldn't remember those commands and had to fiddle with them all over again. Wouldn't it be nice if there was something that recorded everything you did in a text file? Once you are done experimenting, you could tell it to stop recording your session and then use that text file for later reference.
Start-Transcript and Stop-Transcript do just that. Before you start experimenting, tell PowerShell you want to record everything in a text file with Start-Transcript; after you are done, use Stop-Transcript to stop recording.  I like this feature. But there is more.
With Start-Transcript you have to provide a .txt file, and I don't like providing path information with a unique name every time I want to start a transcript; I want it to be automatic.  The function below creates a unique file name based on a timestamp. If you wish to provide a meaningful name, you can do that too, and the script will append a unique timestamp to that name.  You can put this function into your profile and it will be available when the console loads.
function Start-Recording {
    param(
        [string]$sessName
    )
    # Stop any transcript that is already running; ignore the error if there isn't one.
    try { Stop-Transcript } catch {}

    $uniqFileName = (Get-Date).ToString('MMddyyyyhhmmss')
    if ([System.String]::IsNullOrEmpty($sessName)) {
        Start-Transcript -Path "C:\Scripts\Transcripts\$uniqFileName.txt" -NoClobber
    }
    else {
        Start-Transcript -Path "C:\Scripts\Transcripts\$sessName$uniqFileName.txt" -NoClobber
    }
}

Set-Alias Stop-Recording Stop-Transcript
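
A quick usage sketch, assuming the function and alias are loaded from your profile:

PS C:\> Start-Recording "webpackage"    # starts a transcript under C:\Scripts\Transcripts
# ...run your experimental commands...
PS C:\> Stop-Recording                  # alias for Stop-Transcript; saves the session
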
I invoke Start-Recording "webpackage" and a uniquely named file is created in my Scripts\Transcripts folder under that name.  The function also calls Stop-Transcript if you execute Start-Recording again, so it saves the existing session and starts recording a new one.