Machine Learning Workflow


There are distinct phases or steps that have to be carried out to build a complete machine learning model. The sequence of these phases or steps can be defined as the machine learning workflow.

A Brief Overview of Machine Learning

Machine learning is a tool for turning information into knowledge. Machine learning (ML) is a type of artificial intelligence (AI) that allows software applications to become more accurate at predicting outcomes without being explicitly programmed to do so. Machine learning algorithms use historical data as input to predict new output values.

More Details: https://mdrijwanansari.wordpress.com/2020/05/18/machine-leaning-an-introduction/

Machine Learning Workflow

  • Specifying the Problem
  • Data Preparation
  • Selecting the Algorithm
  • Training the Model
  • Testing the Model
  1. Ask the right question: The ML workflow starts with defining a specific question or problem with a defined boundary. The right question will lead you to understand the data and its preparation, identify the algorithm, test the model and judge the overall outcome of the model. Some examples: 1. Suppose you need to predict an individual’s credit risk based on the information they gave on a credit application. Credit risk assessment is a complex problem; however, an ML solution can add a new dimension for effective analysis. 2. A solution that will tell which tweets will get retweets.
  2. Data preparation: This is the most important phase of a machine learning solution and depends heavily on Phase 1, i.e. the problem. Defining the problem, or the accurate question, tells us what data we need and how to prepare it. Almost 60% of the overall time will be spent on data preparation. Data preparation, in general, means transforming raw data into a format that can be modeled using machine learning algorithms. This phase includes a number of sub-steps such as data cleaning, filtering, manipulation, scaling and reduction, sampling and splitting. The actions carried out for data cleaning or manipulation include adding columns/rows, cleaning missing data, editing metadata, joining data, removing duplicate rows, categorization and many more. Another important point is that we always split the data into at least two parts, a training and a testing dataset, which is also considered part of data preparation (see the code sketch after this list).
  3. Selecting the algorithm: Choosing the algorithm depends solely on the problem (Phase 1: the question) for which we are designing the ML model. Numerous well-established algorithms are available and ready to apply to a machine learning solution. Anomaly detection, classification, clustering and regression are the categories of models or algorithms, grouped by the kind of problem they solve, and many mature algorithms are available under each category. Some examples of machine learning algorithms are Linear Regression, Neural Network Regression, Two-Class Decision Forest, Multiclass Decision Jungle, K-Means Clustering, PCA-Based Anomaly Detection, etc. When building an ML solution, we normally do not design or create new algorithms; instead, we trial different established algorithms and find the most suitable one for our problem.
  4. Training the model: This stage is also known as the fitting stage, where the prepared and formatted data are used with the selected algorithm to train the model. In other words, the model learns from the prepared training data.
  5. Testing and evaluating the model: As described in the data preparation stage, the data are divided into two parts: a training and a testing dataset. In this stage, the testing data are used to score the model and see how well it performs. The test data are fed into the trained model and the output is compared with the actual values to measure the accuracy level.
  6. Maintenance: This is also a crucial part of maximizing model performance, where new or recent data are fed to the model again and taken through all the processes.
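
The data preparation, training and evaluation steps above can be illustrated with a minimal Python/scikit-learn sketch. This is only an illustrative outline, not code from the original post; the data.csv file and the "target" column name are assumptions.

# Minimal workflow sketch: prepare data, select an algorithm, train, evaluate.
# Assumes a hypothetical CSV with feature columns and a "target" label column.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

# Phase 2: data preparation - load, clean and split into training/testing sets.
data = pd.read_csv("data.csv").dropna()          # drop rows with missing values
X = data.drop(columns=["target"])                # input features
y = data["target"]                               # label we want to predict
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Phase 3: select an established algorithm (here a random forest classifier).
model = RandomForestClassifier(n_estimators=100, random_state=42)

# Phase 4: training (fitting) the model on the training data.
model.fit(X_train, y_train)

# Phase 5: testing and evaluating the model on the held-out test data.
predictions = model.predict(X_test)
print("Accuracy:", accuracy_score(y_test, predictions))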

Most of the phases are iterative, depending on the result of testing and evaluation. If the evaluation score is below expectation, the process steps back by one phase and another algorithm is selected for further processing. This is the continuous process of machine learning. Sometimes we may even need to jump back to the data preparation phase based on the evaluation.

Conclusion

The machine learning workflow is a combination of the defined steps in a specific succession. It starts with defining the problem and proceeds through data preparation, algorithm selection, model training, and testing and evaluation. More importantly, the later phases are iterative depending upon the evaluation. Maintenance, in addition, also has great significance for machine learning performance.

Machine Learning | An Introduction


Introduction

Machine learning (ML) is a type of artificial intelligence (AI) that allows software applications to become more accurate at predicting outcomes without being explicitly programmed to do so. Machine learning algorithms use historical data as input to predict new output values.

Until a few decades ago, historical data was primarily used for the following two purposes:

  • As a record of what happened.
  • To identify the root cause of why it happened.

Although those reasons remain valid, in the last decade we have added a dimension where data is utilized to predict what could potentially happen in the future. This is where machine learning comes in, playing a significant role in doing so. Machine learning is a subset/subfield of artificial intelligence. Generally, the main aim of machine learning is to understand the structure of data and apply the best possible models, which can be utilized to identify hidden patterns. Developing a machine learning model is one of the key factors in predicting future outcomes, which in turn requires machine learning algorithms. Numerous machine learning algorithms have been developed and are mature enough to solve various real-world business problems.

Although machine learning is a field within computer science, it differs from traditional computational approaches. In traditional computing, algorithms are sets of explicitly programmed instructions used by computers to calculate or solve problems. Machine learning algorithms instead allow computers to train on data inputs and use statistical analysis in order to output values that fall within a specific range. Because of this, machine learning facilitates computers in building models from sample data in order to automate decision-making processes based on data inputs.

Using machine learning, information is turned into knowledge. In the last five to six decades, an enormous amount of data has been recorded and collected, which is of no use if we don’t analyze it to find hidden patterns. To find useful and significant patterns in complex data, we have several machine learning techniques available to ease our struggle for discovery. Subsequently, those identified hidden patterns and the knowledge of the problem can help perform complex decision making and predict future occurrences.

Machine Learning Methods

Broadly, data can be classified into two categories: labeled data and unlabeled data.

There are many approaches that can be taken when conducting machine learning. They are usually grouped into the areas described below. Supervised and unsupervised learning are well-established approaches and the most commonly used. Semi-supervised and reinforcement learning are newer and more complex but have shown impressive results.

According to the famous machine learning concept known as the No Free Lunch Theorem, there is no single algorithm that works best for all tasks, i.e. each task has its own idiosyncrasies.

Let’s explore the supervised and unsupervised methods in more detail.

Supervised Learning

Supervised learning is primarily used to address two kinds of problems (identifying a value): regression and classification.

In supervised learning, the goal is to learn the mapping (the rules) between a set of inputs and outputs.

For example, the inputs could be the weather forecast (such as temperature), and the outputs would be the number of visitors to the beach. The goal in supervised learning would be to learn the mapping that describes the relationship between temperature and the number of beach visitors.

In supervised learning, the computer is provided with sample inputs that are labeled with their specific outputs. The motive of this method is for the algorithm to be able to “learn” by comparing its actual output with the “taught” outputs to find errors, and modify the model accordingly. Supervised learning therefore uses patterns to predict label values on additional unlabeled data.

A common use case of supervised learning is to use historical data to predict statistically likely future events. It may use historical stock market information to anticipate upcoming fluctuations, or be employed to filter out spam emails. In supervised learning, tagged photos of dogs can be used as input data to classify untagged photos of dogs.
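
As a minimal illustration (a sketch, not code from the original post; it assumes scikit-learn and its built-in Iris dataset), a supervised classifier learns the mapping from labeled examples and then predicts labels for unseen data:

# Supervised learning sketch: learn a mapping from labeled inputs to known outputs.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)                # labeled examples: inputs plus known outputs
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)                        # "learn" the mapping from the taught outputs
print("Test accuracy:", clf.score(X_test, y_test))  # check predictions against held-out labels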

Unsupervised Learning

Unsupervised learning is primarily used to address clustering problems (identifying data patterns).

In unsupervised learning, only input data is provided in the examples. There are no labelled example outputs to aim for. But it may be surprising to know that it is still possible to find many interesting and complex patterns hidden within data without any labels.

An example of unsupervised learning in real life would be sorting coins of different colors into separate piles. Nobody taught you how to separate them, but just by looking at their features, such as color, you can see which coins are associated and cluster them into the correct groups.

In unsupervised learning, data is unlabeled, so the learning algorithm is left to find commonalities among its input data. As unlabeled data are more abundant than labeled data, machine learning methods that facilitate unsupervised learning are particularly valuable.

The goal of unsupervised learning may be as straightforward as discovering hidden patterns within a dataset, but it may also have a goal of feature learning, which allows the computational machine to automatically discover the representations that are needed to classify raw data.
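
A minimal sketch of unsupervised learning (again assuming scikit-learn; the synthetic data here is an illustrative assumption) clusters unlabeled points purely from their features:

# Unsupervised learning sketch: find groups in data without any labels.
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)   # no labels are given to the model

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = kmeans.fit_predict(X)                                # the algorithm discovers the clusters itself
print("Cluster sizes:", [int((labels == k).sum()) for k in range(3)])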

Semi-supervised learning is a middle road between the supervised and unsupervised approaches. It mixes a small amount of labelled data with a much larger unlabeled dataset, which reduces the burden of having enough labelled data. Many real-world problems fall into this situation and need a machine learning solution.

The last one, reinforcement learning, is less common and much more complex; however, it has incredible applications. If you’re familiar with psychology, you’ll have heard of reinforcement learning. If not, you’ll already know the concept from how we learn in everyday life. In this approach, occasional positive and negative feedback is used to reinforce behaviours. Think of it like training a dog: good behaviours are rewarded with a treat and become more common, while bad behaviours are punished and become less common. This reward-motivated behaviour is key in reinforcement learning.

Games are very popular in Reinforcement Learning research. They provide ideal data-rich environments. The scores in games are ideal reward signals to train reward-motivated behaviors. Additionally, time can be sped up in a simulated game environment to reduce overall training time. A Reinforcement Learning algorithm just aims to maximize its rewards by playing the game over and over again. If you can frame a problem with a frequent ‘score’ as a reward, it is likely to be suited to Reinforcement Learning.
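
As a toy sketch of how reward feedback drives learning (this example is an illustrative assumption, not from the original post), a simple epsilon-greedy agent learns which of three actions pays off best purely from the rewards it receives:

# Reinforcement-learning flavoured sketch: an epsilon-greedy bandit agent
# that learns the best action through trial, error and reward.
import random

true_rewards = [0.2, 0.5, 0.8]           # hidden payout probability of each action (assumed)
estimates = [0.0, 0.0, 0.0]              # the agent's learned value of each action
counts = [0, 0, 0]
epsilon = 0.1                            # how often the agent explores instead of exploiting

for step in range(1000):
    if random.random() < epsilon:
        action = random.randrange(3)                          # explore a random action
    else:
        action = max(range(3), key=lambda a: estimates[a])    # exploit the best-known action
    reward = 1.0 if random.random() < true_rewards[action] else 0.0
    counts[action] += 1
    # incremental average: nudge the estimate toward the observed reward
    estimates[action] += (reward - estimates[action]) / counts[action]

print("Learned action values:", [round(v, 2) for v in estimates])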

In the next chapter, we will explore the approaches and algorithms further and proceed to real machine learning solutions.

Conclusion

Machine learning (artificial intelligence) has now become an important part of our daily lives, with incredible applications. Here I have introduced machine learning and its approaches with real-world business cases. Keep an eye out for more blogs!

Add-Migration : The term ‘Add-Migration’ is not recognized


Detailed error:

PM> Add-Migration 'Intialize Database'
Add-Migration : The term 'Add-Migration' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the 
path is correct and try again.
At line:1 char:1
+ Add-Migration 'Intialize Database'
+ ~~~~~~~~~~~~~
    + CategoryInfo          : ObjectNotFound: (Add-Migration:String) [], CommandNotFoundException
    + FullyQualifiedErrorId : CommandNotFoundException
 

Solution

Simply install the Microsoft.EntityFrameworkCore.Tools package from NuGet:

Install-Package Microsoft.EntityFrameworkCore.Tools -Version 3.1.6

You can also use this link to install the latest version: Nuget package link

Tip: Install the latest stable version.

Sometimes the error persists because of caching, so restarting Visual Studio can solve the issue without doing anything else.

Invalid option ‘7.3’ for /langversion; must be ISO-1, ISO-2, Default, Latest or a valid version in range 1 to 7.1.


This compiler error sometimes occurs after a version change, when the .NET compiler version and the C# language version are incompatible.

Error:

Compilation Error
Description: An error occurred during the compilation of a resource required to service this request. Please review the following specific error details and modify your source code appropriately.            
Compiler Error Message: CS1617: Invalid option ‘6’ for /langversion; must be ISO-1, ISO-2, 3, 4, 5 or Default


Compiler Error Message: Invalid option ‘7.3’ for /langversion; must be ISO-1, ISO-2, Default, Latest or a valid version in range 1 to 7.1.


Solution:

Solution 1: The easiest way to resolve the problem is to update the following NuGet packages (whichever are installed):

  • Microsoft.CodeDom.Providers.DotNetCompilerPlatform
  • Microsoft.Net.Compilers

Solution 2: Pay attention to the compiler "type" in the Web.config file when changing the framework version:

for 4.5 and C#5 –

type="Microsoft.CSharp.CSharpCodeProvider...

for 4.6 and C#6 –

type="Microsoft.CodeDom.Providers.DotNetCompilerPlatform.CSharpCodeProvider, Microsoft.CodeDom.Providers.DotNetCompilerPlatfor

Workaround:

<system.codedom>
    <compilers>
      <compiler language="c#;cs;csharp" extension=".cs" type="Microsoft.CSharp.CSharpCodeProvider, System, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" warningLevel="4" compilerOptions="/langversion:6 /nowarn:1659;1699;1701">
          <providerOption name="CompilerVersion" value="v4.0"/>
      </compiler>
      <compiler language="vb;vbs;visualbasic;vbscript" extension=".vb" type="Microsoft.CodeDom.Providers.DotNetCompilerPlatform.VBCodeProvider, Microsoft.CodeDom.Providers.DotNetCompilerPlatform, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" warningLevel="4" compilerOptions="/langversion:14 /nowarn:41008 /define:_MYTYPE=\&quot;Web\&quot; /optionInfer+"/>
    </compilers>
  </system.codedom>

In the compilerOptions attribute, "langversion:6" needs to be replaced with "langversion:5".

How to update files of an Azure-hosted site


Rijwan Ansari

There are a few steps to follow to edit and update the files of an application hosted in Azure. Follow the steps below.

  1. Log in to the Azure portal, click on App Services and go to App Service Editor (Preview) under Development Tools.


2. Click on Go. All the files of the application will be listed on the left side.


3. Click on the file that you want to edit; the file will open so you can change it according to your needs and then save it. If you forget to save your changes, don't worry: the editor saves the newly changed values automatically.



Change the session timeouts in SharePoint sites


In some cases, our clients (SharePoint clients) may request a change to the session timeout of SharePoint sites. SharePoint allows you to set the timeout of the user session so that users are logged out after a certain period of inactivity. Additionally, the current page state will expire based on the configured timeout. There are multiple ways we can configure the session timeout.

From Central Administration:

1. Go to SharePoint Central Administration.
2. Go to Application Management.
3. Select Configure Session State.
4. Under Timeout, change the default value stored there.

PowerShell (SharePoint Management Shell)

This is handy for forms which have to be filled in within a certain period of time before they expire. Alternatively, if you want to extend the timeout, you can do that too.

$spsite = Get-SPSite "[site collection url]"
$webApp = $spsite.WebApplication
# Page/form expiration is controlled by the web application's FormDigestSettings
$webApp.FormDigestSettings.Enabled = $true
$webApp.FormDigestSettings.Expires = $true
$webApp.FormDigestSettings.Timeout = New-TimeSpan -Hours 2 -Minutes 30
$webApp.Update()

This will enable session expiration and set page content to timeout after 2 hours and 30 minutes.

If you’re using claims authentication and would like your provider to expire sessions after a certain period of inactivity, here is how to do that with PowerShell:

# Use session cookies so the logon token expires with the browser session
$tokenservice = Get-SPSecurityTokenServiceConfig
$tokenservice.UseSessionCookies = $true
$tokenservice.LogonTokenCacheExpirationWindow = New-TimeSpan -Minutes 5
$tokenservice.Update()

The logon token expiry will be set to 5 minutes.

An unhandled exception occurred: Could not find module “@angular-devkit/build-angular”


This error occurs because of a missing dev dependency that was newly introduced in Angular 6.0 and above.

Solution:

Install @angular-devkit/build-angular

npm install --save-dev @angular-devkit/build-angular
Or
yarn add @angular-devkit/build-angular --dev

In some cases, you may also need to run:

npm install

ng update

and finally

npm update

Regards

Create a Database View using Entity Framework (EF) Code First Approach


How do you create a database view using the Entity Framework Code First approach?

Use case scenario:

There are several cases where applications may need to display data by combining two or more tables, sometimes even more than 7-8 tables. In such scenarios, using Entity Framework may result in slow performance because we have to select data from one table and then loop over the other tables.

However, the database itself has features, such as stored procedures and views, which are most recommended here and give the best performance. This blog will show how to overcome the problem by creating a view with Entity Framework.

Option 1

Create a view combining multiple tables in the database manually, and subsequently add an entity for the view. Finally, we can ignore the entity in OnModelCreating on the model builder so that migrations do not attempt to create a table for it.

Sample code:

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    // Skip the view entity during migrations so EF does not try to create a table for it
    if (IsMigration)
        modelBuilder.Ignore<ViewEntityName>();
    ...
}

Option 2

Alternatively, you can create an extension method or a property for handling the view in the database. In this option, we have to create the view manually in the database and then add the extension method or property.

Sample code

//Property (EF6 syntax): expose the database view from the context
class DBContext : DbContext
{
    public IQueryable<YourView> YourView
    {
        get
        {
            // Map the rows of the view onto the YourView class
            return this.Database.SqlQuery<YourView>("select * from dbo.ViewName").AsQueryable();
        }
    }
}

Extension

static class DbContextExtensions
{
    // Extension method exposing the view (the method name here is illustrative)
    public static IQueryable<ViewNameModel> QueryViewName(this DbContext context)
    {
        return context.Database.SqlQuery<ViewNameModel>("select * from dbo.ViewName").AsQueryable();
    }
}

There are some other alternatives as well; however, I prefer these options as they are easy to implement.

Hence, these are some quick ways to implement database views with the Entity Framework Code First approach.

Create SQL Server Database Project With Visual Studio. Create or compare two Databases.


Here, we are going to learn about the SQL Server Database Project (template) available in Visual Studio. I will cover the following points:

  • Introduction to the SQL Server Database Project.
  • Create a new SQL Server Database Project.
  • Import a database schema into the project from an existing database, a .sql script file or a data-tier application (.dacpac).
  • Publish to create a new database in SQL Server.
  • Compare two databases and find the differences.
  • Compare, update or create update scripts.

The database plays a most important role in any application, and it becomes difficult to manage the project when the number of tables, views and stored procedures increases.

Consider scenarios where multiple developers are working on a project for the next release; some are working on bugs or adding new features, which again require some or many changes in the database. In most cases, developers take note of DB (database) changes manually. Sometimes they miss changes, which is costly in production. In many cases the Dev, UAT and production databases are different, and it is again a hassle to identify the differences.

There are a number of tools available on the market for comparing databases, but they are costly, paid solutions.

So, in this article we will discuss and learn about the SQL Server Database Project, which is available in Visual Studio and is free. Yes, free!

Prerequisites: Visual Studio (2015 or 2017) and MS SQL  Server. I am using VS2017 and SQL Server 2017 Developer for illustration.

Introduction:

You can create a new database project and import database schema from an existing database, a .sql script file or a Data-tier application (.dacpac). You can then invoke the same visual designer tools (Transact-SQL Editor, Table Designer) available for connected database development to make changes to the offline database project, and publish the changes back to the production database. The changes can also be saved as a script to be published later. Using the Project Properties pane, you can change the target platform to different versions of SQL Server (including SQL Azure). (Adapted from Microsoft Docs.)

Create New SQL Server Database Project

  1. Open Visual Studio and create a blank solution.

2. Add a Project.

3. Select SQL Server from the left panel and then SQL Server Database Project. Name the project (here, SampleAccount).

Import a database schema into the project from an existing database, a .sql script file or a data-tier application (.dacpac)

4. Right-click on the project and select Import. There will be three options: Data-tier Application (.dacpac), Database and Script. Select Database.

5. Provide the connection details, i.e. select or enter the server, authentication type and database.

6. Set the import settings as highlighted.

7. Click Start, which will show the progress window.

After it finishes, we will see the tables, views and stored procedures in our project.

This is how we can add the database in the SQL Server Database Project.

Publish to create a new database in SQL Server

The database imported into the project above can be used to create a new database with the same schema.

  1. Right click on the project and choose the publish option.

2. Provide the connection: server name, authentication type and credentials. If we want to publish as a new database, choose the default database, or choose a specific database to publish to.

3. We can generate a script or publish directly. You can explore the advanced options as well to apply rules on publish.

Now our new database is created, or the generated script can be used to create it.

Compare two databases and find the differences

In this section we will see how to compare two databases to identify the differences, for example between Dev and UAT, or between UAT and Prod.

  1. Right-click on the SQL Server Database project and choose Schema Compare.

2. Select the source and target databases and provide the connection. Note: we can compare two databases, or compare another database with the project database.


3. Click Compare

Then we will see the differences between the source and target databases.

There will be deleted, edited and added objects as shown.

Compare, update or create update scripts

Next, we can update the target database or generate an update script to push the source version.

  1. Complete the compare step; then there are options to do the job.

The Update option directly updates the target database, and Generate Script will produce a SQL script.

From the above, we don’t need to manually create a migration script or plan for the DB. We can easily compare the relevant database tables, views and stored procedures in no time.

So, any changes can be incorporated, making database management effective and efficient.

SharePoint Error: Updates are currently disallowed on GET requests. To allow updates on a GET, set the ‘AllowUnsafeUpdates’ property on SPWeb.


This error occurs while clicking General Settings for a web application in SharePoint Central Administration.

SharePoint Central Administrator >> Manage Web Applications >> Select Web Application (From List) >> General Settings >> General Settings.

Resolution:

Using PowerShell (SharePoint Management Shell):

Run this PowerShell script:

# Reading HttpThrottleSettings initializes the property; Update() then persists the web application settings
$web = Get-SPWebApplication http://SharepointUrl
$web.HttpThrottleSettings
$web.Update()

Kind Regards!