OakLeaf Systems

http://oakleafblog.blogspot.com

OakLeaf Systems is a Northern California software consulting organization specializing in developing and writing about Windows Azure, Windows Azure SQL Database, Windows Azure SQL Data Sync, Windows Azure SQL Database Federations, Windows Azure Mobile Services and Web Sites, Windows Phone 8, LINQ, ADO.NET Entity Framework, OData and WCF Data Services, SQL Server 2008+, and Visual Studio LightSwitch. TIP: Click the latest item's title below to speed loading.

Sunday, January 12, 2014 Executing a Service Provider Licensing Agreement (SPLA) for Windows Azure VM Remote Desktop Services

As noted in my earlier Microsoft Finally Licenses Remote Desktop Services (RDS) on Windows Azure Virtual Machines article, last updated 12/17/2013, organizations providing customers access to Windows Applications running on Windows Azure Virtual Machines (WAVMs) via Remote Desktop Services (RDS), with or without RDWeb access, must enter into an indirect Microsoft Service Provider Licensing Agreement (SPLA) and purchase Subscriber Access Licenses (SALs) from a reseller appointed by Microsoft. The only prerequisites for an SPLA are your firm must be a Registered (or higher) Microsoft Partner and have a Microsoft (Live) ID.

When this post was written in mid-January 2014, Microsoft had appointed eight resellers in the US and seven in Canada. I chose Dell Computer, Inc. as OakLeaf's reseller because they offered this simple sign-up procedure:

1. Send a message to SPLA_US@dell.com or SPLA_Canada@dell.com with your Partner organization name, ID number and physical location (city and state/province), as well as the name of the individual having authority to sign the SPLA.

2. A Dell representative will acknowledge your request.

3. A few hours later, you'll receive a message with "Action Required: Your Microsoft License Agreement is ready for electronic signature{~########:1~}" as the subject from eagreements@microsoft.com with the following information:

Dear Roger Jennings,

Your Microsoft Volume Licensing Agreement has been created by your Microsoft Channel Partner and is ready for acceptance and electronic signature.

Contract Package Number: PKG######### Agreement Number: ######## Customer Number: XXXX####

Action Required

Please click the link below in order to review and sign the agreements on behalf of your company. If you no longer have signing authority, please notify your Channel Partner immediately and do not use the below link.

Note: You will be prompted to sign into eAgreements using a valid Windows Live ID. If you do not have a Windows Live ID, you are required to create one and will be prompted to do so after you click the link below.

Clicking the provided URL opens the following Volume Licensing Welcome page:

Sign in with your Windows Live ID to open the eAgreements Registration page:

Click each of the four Document Name links, download the following Word documents and review them or route them to your company's legal department for review and approval:

Microsoft Business and Services Agreement: MBSA Agreement.docx
Service Provider License Agreement (Indirect): Master Agreement.docx (form is prefilled)
Signature Form: Signature Form.docx
End User License Terms: SPLA End User License Terms.docx

After review and approval of the four documents, select the I Confirm option button to expand the bottom section. Type the signatory's name, job title and, optionally, purchase order number, as shown here:

Click Submit to send the electronically signed form to Microsoft. Click OK to acknowledge the success message:

Tip: If you receive an error message after clicking Submit, try again at least once.

You'll receive a "Confirmation of Microsoft Volume Licensing Agreement Acceptance and Submission{~########:1~}" e-mail from Microsoft a few minutes after submission. The message contains the following (in part):

Note that this agreement is not wholly formed until accepted and activated by Microsoft.

Once the license agreement is signed by Microsoft, you will receive a Welcome email from Microsoft confirming the activation of your organization's license agreement. The Welcome email will also provide information about how you can track and manage your organization's software licenses online.

I'll update this post when I receive the activation message.

Friday, January 10, 2014 Uptime Report for My Live Windows Azure Web Site: December 2013 = 99.95%

My (@rogerjenn) Android MiniPCs and TVBoxes blog runs WordPress on WebMatrix with Super Cache in a Windows Azure Web Site (WAWS) Standard tier instance in Microsoft's West U.S. (Bay Area) data center with ClearDB's MySQL database (Venus plan). I converted the site from the Shared Preview to the Standard tier on September 7, 2013 in order to store diagnostic data in Windows Azure blobs instead of the file system.

Service Level Agreements aren't applicable to the Web Sites Shared tier; only sites in the Standard tier qualify for the 99.9% uptime SLA.

Running a Shared Preview WAWS costs ~US$10/month plus MySQL charges
Running a Standard tier instance costs ~US$75/month plus MySQL charges

I use Windows Live Writer to author posts that provide technical details of low-cost MiniPCs with HDMI outputs running Android JellyBean 4.1+, as well as Google's new Chromecast device. The site emphasizes high-definition 1080p video recording and rendition.

The site commenced operation on 4/25/2013. To improve response time, I implemented WordPress Super Cache on May 15, 2013. I moved the site's diagnostic logs from the file system to Windows Azure blobs on 9/7/2013, as reported in my Storing the Android MiniPCs Site's Log Files in Windows Azure Blobs post.


Here's Pingdom's graphical Uptime report (first page in Downtime mode) for December 2013:

Note: Downtime is exaggerated because Pingdom's timing resolution is five minutes.
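For example, with four outages in December, the minimum downtime Pingdom can report is 4 × 5 = 20 minutes; the 21 minutes shown in the table below, against December's 31 × 24 × 60 = 44,640 total minutes, works out to about 0.05%, which is where the 99.95% uptime figure comes from.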

And here's their Response Time report for the same period:

Response time is slowed by the length of this blog's posts and the number of images they contain.

The Windows Azure Management Portal displays resource utilization for a maximum period of one prior week:

I plan to report and log uptime values monthly with pages similar to OakLeaf Systems' Uptime Report for my Live OakLeaf Systems Azure Table Services Sample Project posts, which have more than two full years of uptime and response time data as of June 2013.

Month Year      Uptime   Downtime  Outages  Response Time
December 2013   99.95%   00:21:00   4       1,868 ms
November 2013   99.92%   00:35:00   5       1,877 ms
October 2013    99.63%   02:40:00  23       2,061 ms
September 2013  99.81%   01:20:00   7       1,816 ms
August 2013     99.71%   02:10:00  24       1,786 ms
July 2013       99.37%   04:40:00  45       2,002 ms
June 2013       99.34%   04:45:00  30       2,431 ms
May 2013        99.58%   03:05:00  32       2,706 ms

Note: The usual 99.9% uptime Service Level Agreement (SLA) applies only to Windows Azure Web Sites running in the Standard tier. The Android MiniPCs and TVBoxes site was upgraded from the Shared to Standard tier on 9/7/2013.
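For reference, a 99.9% SLA corresponds to roughly 0.001 × 43,200 = 43 minutes of allowable downtime in a 30-day month.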

Technorati Tags: Windows Azure Web Services,WAWS,Uptime Reports,Pingdom,Response Time Reports,WordPress,WordPress Super Cache,Android Jelly Bean,Android,MiniPC,TVBox

Wednesday, January 08, 2014 Uptime Report for my Live OakLeaf Systems Azure Table Services Sample Project: December 2013 = 100.00%

My (@rogerjenn) live OakLeaf Systems Azure Table Services Sample Project demo project runs two small Windows Azure Web role compute instances from Microsoft's South Central US (San Antonio, TX) data center. This report now contains more than two full years of monthly uptime data.


Here's the detailed downtime report from Pingdom for December 2013:


Following is the first page of the detailed Pingdom response time data for December 2013:

You can check the site's Public Status Page (shown for 12/2/2013) by clicking here:


This is the thirty-first uptime report for the two-Web role version of the sample project since it was upgraded to two instances. Uptimes below the SLA's 99.9% minimum are emphasized. Reports will continue on a monthly basis.

Month Year      Uptime    Downtime  Outages  Response Time
December 2013   100.00%   00:00:00   0       666 ms
November 2013    99.99%   00:00:05   1       796 ms
October 2013    100.00%   00:00:00   0       720 ms
September 2013   99.95%   00:00:20   2       1,094 ms
August 2013     100.00%   00:00:00   0       850 ms
July 2013       100.00%   00:00:00   0       814 ms
June 2013       100.00%   00:00:00   0       750 ms
May 2013        100.00%   00:00:00   0       854 ms
April 2013      100.00%   00:00:00   0       842 ms
March 2013      100.00%   00:00:00   0       717 ms
February 2013    98.78%   08:10:00   3       799 ms
January 2013    100.00%   00:00:00   0       628 ms
December 2012   100.00%   00:00:00   0       806 ms
November 2012   100.00%   00:00:00   0       745 ms
October 2012    100.00%   00:00:00   0       686 ms
September 2012  100.00%   00:00:00   0       748 ms
August 2012      99.92%   00:35:00   2       684 ms
July 2012       100.00%   00:00:00   0       706 ms
June 2012       100.00%   00:00:00   0       712 ms
May 2012        100.00%   00:00:00   0       775 ms
April 2012       99.28%   05:10:08  12       795 ms
March 2012       99.96%   00:20:00   1       767 ms
February 2012    99.92%   00:35:00   2       729 ms
January 2012    100.00%   00:00:00   0       773 ms
December 2011   100.00%   00:00:00   0       765 ms
November 2011    99.99%   00:05:00   1       708 ms
October 2011     99.99%   00:04:59   1       720 ms
September 2011   99.99%   00:05:00   1       743 ms
August 2011      99.98%   00:09:57   2       687 ms
July 2011       100.00%   00:00:00   0       643 ms
June 2011       100.00%   00:00:00   0       696 ms

Update 5/7/2013: Added links to those previous uptime reports currently accessible. I don't know why some posts are missing from the Internet.

The Azure Table Services Sample Project

See my Republished My Live Azure Table Storage Paging Demo App with Two Small Instances and Connect, RDP post of 5/9/2011 for more details about the Windows Azure test harness instance.

I believe this project is the oldest continuously running Windows Azure application. I first deployed it in November 2008 when I was writing Cloud Computing with the Windows Azure Platform for Wiley/WROX, which was the first book published about Windows Azure.

The application runs the default set of Windows Azure Diagnostics. About 5 GB of Event Counter data provides the source data for my Generating Big Data for Use with SQL Azure Federations and Apache Hadoop on Windows Azure Clusters post of 1/8/2012.

For more details about the Table Services Sample project or to download its source code, visit its Microsoft Pinpoint entry. Following are Pinpoint listings for four other related OakLeaf sample projects, two of which are live in the South Central US data center:

OakLeaf Public Data Hub: US Air Carrier Flight Delay DataSet (5/1/2012, live)
Microsoft Codename Data Explorer Mashup with Social Analytics Data (1/24/2012, live)
Microsoft Codename Social Analytics WinForms Client Sample (11/22/2011)
SQL Azure Reporting Services Preview Sample (10/24/2011, live)

This report blogged from the Consumer Electronics Show 2014, Las Vegas, Nevada

Technorati Tags: Windows Azure,Azure,Windows Azure Cloud Services,Web Roles,Windows Azure Table Storage,Uptime Reports,Pingdom,Cloud Computing

Saturday, January 04, 2014 Microsoft's Project Siena Needs a Windows Azure Mobile Services Sample App

S. Soma Somasegar (@SSomasegar) asserts Microsoft's new Project Siena is intended for business experts, business analysts, consultants and other business users with the imagination to conceive an app for today's mobile devices in his Project Siena: Enabling Business Users to Create Mobile Apps for the Enterprise post of 12/19/2013 to TechNet's Project Siena blog.

Project Siena's tagline is

Build powerful business apps, no coding necessary.

Soma cites as examples:

Apps to support conversations, where a customer can reach over and touch the employee's screen, make changes and ultimately own the solution
Apps with the smarts to help users make on-the-spot decisions by offering choices first and allowing trade-offs later
Apps for tasks that involve capturing real-world information through photos, videos and voice

Project Siena's Tagline Is an Oxymoron

I have never seen a real-world business app, powerful or not, that didn't require at least some custom Visual Basic, VBA, C#, Java, JavaScript or code in another programming language. For its Visual Studio LightSwitch rapid application development (RAD) tool, Microsoft claims:

When you use LightSwitch, much of the repetitive work is done for you and, in fact, you can create a LightSwitch application without writing any code at all! For most applications, the only code you have to write is the code that only you can write: the business logic.

Tim Anderson's (@timanderson) Figuring out Project Siena: a Windows 8 app to build Windows 8 apps article of 1/4/2014 describes his frustrations with fathoming the functions required to program a simple todo list:

A couple of weeks back I took a look at Project Siena, a preview of a new tool for building Windows 8 apps. Project Siena features a simplified user interface builder, an Excel-like expression language, and data-bound controls. It generates Windows 8 JavaScript apps. Project Siena is itself a Windows Store app, and runs fine on Windows RT (the ARM version). I have been using it successfully on Surface 2, on which it runs sweetly. [Emphasis added.]

When I first looked at Project Siena I tried to build the same first app that I have used for numerous simple tests of development tools over the years: a to-do list. I was impressed by how easy it was to create the user interface, but unable to work out the code to complete it. Unless I missed it, the key information is not included in any of the initial documentation. I found this disappointing, since it has been easy to work out the code in every other programming environment I have tried.

In Siena, data is stored in Collection objects, and you can bind a listbox to a collection. By default, a new listbox is bound to an object called ListboxSample, but you cannot use it for this; if you try, you get a squiggly line error with the message that ListboxSample is not a collection.

Instead, you have to create your own collection object. In Siena, you declare a variable by using it and its type is inferred. Enter this for the OnSelect property of the Add button:

Collect(mycollection,{Value: InputText1!Text})

This is the code that took me so long to work out. The Collect function adds an item to a collection. If the collection does not already exist, it creates it. The first argument to Collect is a collection object, and the second, an item. What is an item? In effect, a record or row in a table. The syntax for an item in Siena is:

{Fieldname1: fieldvalue1, Fieldname2: fieldvalue2, ...}

where the dots represent additional fields as required. Therefore, the code I entered for the Add button creates or appends an item with a single field, called Value, to a collection called mycollection.

Now you can select the listbox and tap Data and then Items. The collection called mycollection magically appears for selection. Select it. In the case of multi-field collections, you can also choose which field appears in the list. Only one field it seems; yes, Siena needs a grid control. [Emphasis added.]

Defining function expressions to respond to events sounds like programming to me. The Microsoft Project Siena Function Reference lists two operators and 71 functions, many of which have complex argument lists. Even inline If() expressions are supported:

Usage:
If(predicate1, expression1,
[predicate2, expression2,
...
predicateN, expressionN,
default])

Returns the result of evaluating the expression that corresponds to the first matching predicate. If none of the predicates match, default is evaluated and its result gets returned.
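As a purely hypothetical illustration (the Hours variable below is invented for this example and does not come from the Function Reference), an If() expression might read:

If(Hours < 12, "Good morning", Hours < 18, "Good afternoon", "Good evening")

Here the first matching predicate (Hours < 12 or Hours < 18) selects the expression that is returned, and "Good evening" is the default.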

The Project Siena expressions remind me of those available to Microsoft Access developers from the Jet Expression Service.

Instructions for Importing Windows Azure Mobile Services

From the Microsoft Project Siena Help Topics:

How do I import Windows Azure Mobile Services ([W]AMS) data into my app?

If your enterprise uses Windows Azure Mobile Services, contact your IT department or the administrator of the account for the account details. Then add [W]AMS tables as data sources to your app:

1. Click the + control in the Data Sources backstage.
2. Click Azure Mobile Services in the Data Sources menu.
3. You will be prompted for the URL (usually of the form https://EnterpriseName.azure-mobile.net) and AppKey (this is generated when the account is set up; use an application key and not the master key). Enter this information and click the search icon.
4. All the tables in your [W]AMS instance are listed. Select one or more of the instances and then click Import Data to import the data into your app.

Note that in the current release, we support a simplified and lightweight interface for accessing [W]AMS data. As part of the [W]AMS setup, your IT administrator will also need to create a table named zz_config in your [W]AMS instance for Siena to be able to interact with it. This table should contain the following columns:

Key: This should be set to table.
Value: String value that stores the table name.

The zz_config table should contain one row per table that you want to expose to Siena. For example, if you want to allow Siena to access Employee and Sales tables, your zz_config table will have the following entries:

Key     Value
table   Employee
table   Sales

Apparently, the Project Siena team didn't get the memo that "Windows Azure whatever," not "Azure whatever," is the naming convention.

Connecting a Project Siena App to Other Data Sources

More from the Microsoft Project Siena Help Topics:

How do I import data into my app?

Links to data sources can be added to your apps for importing data. Create these links through the Data Sources backstage, which can be accessed as follows:

1. Display the app bar by right-clicking anywhere on the canvas or by swiping down from the top of the canvas.
2. Click File, and then click Data Sources.
3. Click the + control to bring up a list of possible data sources.

The current release of Siena supports the following types of data sources that are displayed as options when you click the + control.

Excel
Windows Azure Mobile Services (AMS)
REST
RSS Feed
SharePoint Lists

Note that your SharePoint site, REST web services, or RSS Feed should not be running on the same device as the published app. If this condition is not met, the app will not be able to access the service because of a security restriction with Windows 8 apps that prevents them from calling into web services running on the same device.

How do I import Excel data into my app?

Perform the following steps to import Excel data into your app:

1. Click the + control in the Data Sources backstage.
2. Click Excel in the Data Sources menu. This will bring up the Windows 8 file picker.
3. Browse to the location where your Excel file is stored, select it, and click Open. All the tables within the Excel workbook you selected are displayed.
4. Select one or more tables from the displayed list, and then click Import Data to import the data into your app.

Note that only Excel tables will be imported (see Create an Excel table). If the data has been imported successfully, the first three rows of the table will be displayed.

How do I use REST connectors?

Consume data from REST connectors in your app by following these steps:

1. Click the + control in the Data Sources backstage.
2. Click REST in the Data Sources menu.
3. Enter the URL for your REST service.
4. Choose an authentication method from the drop-down list box.
5. Select the Enable headers check box, if you want to specify any additional custom headers.
6. Import the data. All the tables exposed through your REST service are imported into your app.

Siena can fetch data from REST services using GET requests (your proxy must support GET requests at the supported endpoints) if the following requirements are satisfied. Please contact your IT administrator about these requirements if you are having trouble connecting to your enterprise's REST services.

How do I import SharePoint lists into my app?

Perform the following steps to import data from SharePoint lists:

1. Click the + control in the Data Sources backstage.
2. Click SharePoint in the Data Sources menu.
3. Enter the URL to your SharePoint site, and then click the search icon. All the lists on your SharePoint site to which you have access are displayed.
4. Select one or more lists, and then click Import Data to import the data into your app.

Note that in the current release of Siena, a link to SharePoint data is not live: changes made to the list on the server are not immediately reflected in the app. When loading, the app takes a snapshot of the data on the SharePoint server.

How do I import RSS feed data into my app?

Perform the following steps to import data from RSS feeds:

1. Click the + control in the Data Sources backstage.
2. Click RSS Feed in the Data Sources menu.
3. Enter the URL to your RSS Feed and click the search icon. All the RSS feeds in your specified URL to which you have access are displayed.
4. Select one or more feeds, and then click Import Data to import the data into your app.

Note that in the current release of Siena, a link to RSS Feed data is not live: changes made to the feed on the server are not immediately reflected in the app. When loading, the app takes a snapshot of the data on the RSS Feed server. In the current version of Siena, only Atom and RSS 2.0 feeds are supported.

The Missing Link

The question remains how to use the imported data and how to update it, if possible. This undoubtedly requires code, but Project Siena's three current sample apps, AdventureWorks, Real Coverage Finder and Personnel Manager, don't appear to use external data sources.

The Project Siena Release Notes were the primary documentation for the app when this post was written. Paul Mather posted on 1/2/2014 an illustrated tutorial for a simple app with an Excel worksheet as the read-only data source: Win8 Apps using Project Siena:

WAMS provides ToDo demo tutorials for Windows 8 (C# and JavaScript), Windows Phone, iOS, Android, HTML, Xamarin.iOS and Xamarin.Android. My five-part Windows Azure Mobile Services Preview Walkthrough expands on the Windows 8 C# tutorial:

Windows Azure Mobile Services Preview Walkthrough - Part 1: Windows 8 ToDo Demo Application (C#)
Windows Azure Mobile Services Preview Walkthrough - Part 2: Authenticating Windows 8 App Users (C#)
Windows Azure Mobile Services Preview Walkthrough - Part 3: Pushing Notifications to Windows 8 Users (C#)
Windows Azure Mobile Services Preview Walkthrough - Part 4: Customizing the UI for the Windows Store
Windows Azure Mobile Services Preview Walkthrough - Part 5: Distributing Your App From the Windows Store

I've requested the WAMS group to create a demo app and walkthrough for Project Siena. I'll update this post when and if they do so.

Technorati Tags: Microsoft Project Siena,Project Siena,Siena,Windows Azure Mobile Services,WAMS,Windows 8,Windows Store Apps,JavaScript

Tuesday, December 31, 2013 Windows Azure and Cloud Computing Posts for 12/18/2013+

Top Stories This Week:

DH Kass asked "Windows Azure, Office 365, Bing and Xbox Live to gain more processing power?" in a deck for his "Microsoft Cloud Bet?: $11M Land Deal for Giant Data Center" article of 12/30/2013 for the VarGuy blog in the Windows Azure Infrastructure and DevOps section.
Mingfei Yan (@mingfeiy) posted "Announcing Windows Azure Media Services .NET SDK Extensions Version 2.0" on 12/19/2013 in the Windows Azure Blob, Drive, Table, Queue, HDInsight and Media Services section.
Carlos Figueira (@carlos_figueira) described the new "Enhanced Users Feature in Azure Mobile Services" in a 12/16/2013 post in the Windows Azure SQL Database, Federations and Reporting, Mobile Services section.

A compendium of Windows Azure, Service Bus, BizTalk Services, Access Control, Caching, SQL Azure Database, and other cloud-computing articles.

Note: This post is updated weekly or more frequently, depending on the availability of new articles in the following sections:

Windows Azure Blob, Drive, Table, Queue, HDInsight and Media Services
Windows Azure SQL Database, Federations and Reporting, Mobile Services
Windows Azure Marketplace DataMarket, Power BI, Big Data and OData
Windows Azure Service Bus, BizTalk Services and Workflow
Windows Azure Access Control, Active Directory, and Identity
Windows Azure Virtual Machines, Virtual Networks, Web Sites, Connect, RDP and CDN
Windows Azure Cloud Services, Caching, APIs, Tools and Test Harnesses
Windows Azure Infrastructure and DevOps
Windows Azure Pack, Hosting, Hyper-V and Private/Hybrid Clouds
Visual Studio LightSwitch and Entity Framework v4+
Cloud Security, Compliance and Governance
Cloud Computing Events
Other Cloud Computing Platforms and Services
Windows Azure Blob, Drive, Table, Queue, HDInsight and Media Services


Mingfei Yan (@mingfeiy) posted Announcing Windows Azure Media Services .NET SDK Extensions Version 2.0 on 12/19/2013:

I am excited to announce that our Windows Azure Media Services Extension SDK version 2.0 has just been released! If you were using the first version of the extension SDK, please pay attention, because this version includes a few breaking changes: we did a round of redesign of our existing extension APIs.

What is the extension SDK?

This extension SDK is a great initiative by developer Mariano Conveti from our partner Southworks. This SDK extension library contains a set of extension methods and helpers for the Windows Azure Media Services SDK for .NET. You can get the source code from the GitHub repository or install the NuGet package to start using it.

Why are we doing an extension SDK?

As we know, we have our official Windows Azure Media Services SDK; however, because it is designed to be very flexible for building customized media workflows, you may have to write many lines of code to complete a simple function. What if there were an extension that allows you to complete a basic media workflow in 10 lines? That's why the WAMS extension SDK exists.

Who is maintaining the extension SDK?

This extension SDK is under the official Azure code repository on GitHub. Currently, the majority of the code is contributed by Mariano. The WAMS team is also contributing and is responsible for the GitHub and NuGet releases. However, we welcome the developer community's contributions. Please refer to our GitHub page for more details.

What's new in version 2.0?

I have another blog post on the major features of version 1.0. All of those features still work in this new version. For version 2.0:

We re-organized the extension methods to avoid defining all of them on the CloudMediaContext class.
The new extension SDK is based on the latest Windows Azure Media Services SDK 3.0.0.0.
Uploads can automatically load-balance between multiple storage accounts; you can refer to Mariano's blog for a code example.

Here is how to use this extension SDK to complete a basic media workflow:

1. Open Visual Studio and create a .NET console application (it can target either .NET 4 or .NET 4.5).

2. Right-click your project and select Manage NuGet Packages. Navigate to the Online tab and search for windowsazure.mediaservices in the search box. Select Windows Azure Media Services Extension SDK to install it, as shown in the picture below.

3. Copy and paste the following code into the Main method (it uploads a media file, encodes it into multi-bitrate MP4, and publishes it for streaming):

try
{
    MediaServicesCredentials credentials = new MediaServicesCredentials(acc_name, acc_key);
    CloudMediaContext context = new CloudMediaContext(credentials);

    Console.WriteLine("Creating new asset from local file...");

    // 1. Create a new asset by uploading a mezzanine file from a local path.
    IAsset inputAsset = context.Assets.CreateFromFile(
        @"C:\demo\demo.mp4",
        AssetCreationOptions.None,
        (af, p) =>
        {
            Console.WriteLine("Uploading '{0}' - Progress: {1:0.##}%", af.Name, p.Progress);
        });

    Console.WriteLine("Asset created.");

    // 2. Prepare a job with a single task to transcode the previous mezzanine asset
    //    into a multi-bitrate asset.
    IJob job = context.Jobs.CreateWithSingleTask(
        MediaProcessorNames.WindowsAzureMediaEncoder,
        MediaEncoderTaskPresetStrings.H264AdaptiveBitrateMP4Set720p,
        inputAsset,
        "Sample Adaptive Bitrate MP4",
        AssetCreationOptions.None);

    Console.WriteLine("Submitting transcoding job...");

    // 3. Submit the job and wait until it is completed.
    job.Submit();
    job = job.StartExecutionProgressTask(
        j =>
        {
            Console.WriteLine("Job state: {0}", j.State);
            Console.WriteLine("Job progress: {0:0.##}%", j.GetOverallProgress());
        },
        CancellationToken.None).Result;

    Console.WriteLine("Transcoding job finished.");

    IAsset outputAsset = job.OutputMediaAssets[0];

    Console.WriteLine("Publishing output asset...");

    // 4. Publish the output asset by creating an Origin locator for adaptive streaming,
    //    and a SAS locator for progressive download.
    context.Locators.Create(
        LocatorType.OnDemandOrigin,
        outputAsset,
        AccessPermissions.Read,
        TimeSpan.FromDays(30));
    context.Locators.Create(
        LocatorType.Sas,
        outputAsset,
        AccessPermissions.Read,
        TimeSpan.FromDays(30));

    IEnumerable<IAssetFile> mp4AssetFiles = outputAsset
        .AssetFiles
        .ToList()
        .Where(af => af.Name.EndsWith(".mp4", StringComparison.OrdinalIgnoreCase));

    // 5. Generate the Smooth Streaming, HLS and MPEG-DASH URLs for adaptive streaming,
    //    and the Progressive Download URL.
    Uri smoothStreamingUri = outputAsset.GetSmoothStreamingUri();
    Uri hlsUri = outputAsset.GetHlsUri();
    Uri mpegDashUri = outputAsset.GetMpegDashUri();
    List<Uri> mp4ProgressiveDownloadUris = mp4AssetFiles.Select(af => af.GetSasUri()).ToList();

    // 6. Get the asset URLs.
    Console.WriteLine(smoothStreamingUri);
    Console.WriteLine(hlsUri);
    Console.WriteLine(mpegDashUri);
    mp4ProgressiveDownloadUris.ForEach(uri => Console.WriteLine(uri));

    Console.WriteLine("Output asset available for adaptive streaming and progressive download.");
    Console.WriteLine("VOD workflow finished.");
}
catch (Exception exception)
{
    // Parse the XML error message in the Media Services response and create a new
    // exception with its content.
    exception = MediaServicesExceptionParser.Parse(exception);
    Console.Error.WriteLine(exception.Message);
}
finally
{
    Console.ReadLine();
}

* I put a local file at C:\demo\demo.mp4; please change this file path to point to your own file on disk.

There are a few things I want to highlight in this sample code:

Encoding presets can be selected from MediaEncoderTaskPresetStrings; you no longer need to go to MSDN to check for task presets:

There are two types of locators, as you can see in step 4. In fact, this concept isn't new; it exists in the WAMS SDK as well. However, this is one of the most frequently asked questions, so I'd love to explain it again.

SAS (Shared Access Signature) locator: this is generated to access the video file directly from blob storage (as with any other file). This is used for progressive download.
OnDemandOrigin locator: this URL allows you to access the media file through our origin server. In this example, you can see that three OnDemandOrigin locator URLs get generated in step 5: MPEG-DASH, Smooth Streaming and HLS. This utilizes our dynamic packaging feature: by storing your media file as multi-bitrate MP4, we can deliver multiple media formats on the fly. However, remember that you need at least one reserved streaming unit to make it work.

Below is the picture explaining these two locator URLs:

4. Press F5 and run. In the output console, you should be able to see three streaming URLs and a few SAS locators for the various-bitrate MP4 files. I have shared the project source code for download.

5. If you find any issues, or there are APIs you think would be helpful but are missing from the current collection, don't hesitate to post on the GitHub Issues tab. We are looking forward to your feedback.


Serdar Ozler, Mike Fisher and Joe Giardino of the Windows Azure Storage Team announced availability of the Windows Azure Storage Client Library for C++ Preview on 12/19/2013:

We are excited to announce the availability of our new Windows Azure Storage Client Library for C++. This is currently a Preview release, which means the library should not be used in your production code just yet. Instead, please kick the tires and give us any feedback you might have so we can improve/change the interface for the GA release. This blog post serves as an overview of the library.

Please refer to SOSP Paper - Windows Azure Storage: A Highly Available Cloud Storage Service with Strong Consistency for more information on Windows Azure Storage.

Emulator Guidance

Please note that the 2013-08-15 REST version, which this library utilizes, is currently unsupported by the storage emulator. An updated Windows Azure Storage Emulator is expected to ship with full support of these new features in the next month. Users attempting to develop against the current version of the Storage Emulator will receive Bad Request errors in the interim. Until then, users wanting to use the new features will need to develop and test against a Windows Azure Storage account to leverage the 2013-08-15 REST version.

Supported Platforms

In this release, we provide x64 and x86 versions of the library for both Visual Studio 2012 (v110) and Visual Studio 2013 (v120) platform toolsets. Therefore you will find 8 build flavors in the package:

Release, x64, v120
Debug, x64, v120
Release, Win32, v120
Debug, Win32, v120
Release, x64, v110
Debug, x64, v110
Release, Win32, v110
Debug, Win32, v110

Where is it?

The library can be downloaded from NuGet and full source code is available on GitHub. NuGet packages are created using the CoApp tools and therefore consist of 3 separate packages:

wastorage.0.2.0-preview.nupkg: This package contains header and LIB files required to develop your application. This is the package you need to install, which has a dependency on the redist package and thus will force NuGet to install that one automatically.
wastorage.redist.0.2.0-preview.nupkg: This package contains DLL files required to run and redistribute your application.
wastorage.symbols.0.2.0-preview.nupkg: This package contains symbols for the respective DLL files and therefore is an optional package.

The package also contains a dependency on C++ REST SDK, which will also be automatically installed by NuGet. The C++ REST SDK (codename Casablanca) is a Microsoft project for cloud-based client-server communication in native code and provides support for accessing REST services from native code on multiple platforms by providing asynchronous C++ bindings to HTTP, JSON, and URIs. Windows Azure Storage Client Library uses it to communicate with the Windows Azure Storage Blob, Queue, and Table services.
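For readers who haven't used Casablanca before, the following minimal sketch (my own illustration, not part of the storage library announcement; the include path and URL are assumptions that may vary between C++ REST SDK versions) shows the asynchronous, task-based style the storage client library builds on:

#include <cpprest/http_client.h>   // include path may differ in older C++ REST SDK releases
#include <iostream>

int main()
{
    // The URL below is a placeholder; substitute any HTTP endpoint.
    web::http::client::http_client client(U("http://www.example.com/"));

    // request() returns a pplx::task; .then() chains continuations that run when the
    // response (and then its body) becomes available, and .wait() blocks until the
    // whole chain completes.
    client.request(web::http::methods::GET)
        .then([](web::http::http_response response)
        {
            ucout << U("Status code: ") << response.status_code() << std::endl;
            return response.extract_string();
        })
        .then([](utility::string_t body)
        {
            ucout << body << std::endl;
        })
        .wait();

    return 0;
}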

What do you get?

Here is a summary of the functionality you will get by using Windows Azure Storage Client Library instead of directly talking to the REST API:

Easy-to-use implementations of the entire Windows Azure Storage REST API version 2013-08-15
Retry policies that retry certain failed requests using an exponential or linear back off algorithm
Streamlined authentication model that supports both Shared Keys and Shared Authentication Signatures
Ability to dive into the request details and results using an operation context and ETW logging
Blob uploads regardless of size and blob type, and parallel block/page uploads configurable by the user
Blob streams that allow reading from or writing to a blob without having to deal with specific upload/download APIs
Full MD5 support across all blob upload and download APIs
Table layer that uses the new JSON support on Windows Azure Storage announced in November 2013
Entity Group Transaction support for Table service that enables multiple operations within a single transaction
Support for Read Access Geo Redundant Storage

This release has full support for Read Access to the storage account data in the secondary region. This functionality needs to be enabled via the portal for a given storage account; you can read more about RA-GRS here.

How to use it?

After installing the NuGet package, all header files you need to use will be located in a folder named was (stands for Windows Azure Storage). Under this directory, the following header files are critical:

blob.h: Declares all types related to the Blob service
queue.h: Declares all types related to the Queue service
table.h: Declares all types related to the Table service
storage_account.h: Declares the cloud_storage_account type that can be used to easily create service client objects using an account name/key or a connection string
retry_policies.h: Declares different retry policies available to use with all operations

So, you can start with including the headers you are going to use:

#include "was/storage_account.h"
#include "was/queue.h"
#include "was/table.h"
#include "was/blob.h"

Then we will create a cloud_storage_account object, which enables us to create service client objects later in the code. Please note that we are using https below for a secure connection, but http is very useful when you are debugging your application.

wa::storage::cloud_storage_account storage_account = wa::storage::cloud_storage_account::parse(U("AccountName=account_name;AccountKey=account_key;DefaultEndpointsProtocol=https"));
Blobs

Here we create a blob container, a blob with some text in it, download it, and then list all the blobs in our container:

// Create a blob container
wa::storage::cloud_blob_client blob_client = storage_account.create_cloud_blob_client();
wa::storage::cloud_blob_container container = blob_client.get_container_reference(U("mycontainer"));
container.create_if_not_exists();

// Upload a blob
wa::storage::cloud_block_blob blob1 = container.get_block_blob_reference(U("myblob"));
blob1.upload_text(U("some text"));

// Download a blob
wa::storage::cloud_block_blob blob2 = container.get_block_blob_reference(U("myblob"));
utility::string_t text = blob2.download_text();

// List blobs
wa::storage::blob_result_segment blobs = container.list_blobs_segmented(wa::storage::blob_continuation_token());
Tables

The sample below creates a table, inserts an entity with a couple of properties of different types, and finally retrieves that specific entity. In the first retrieve operation, we do a point query and retrieve the specific entity. In the query operation, on the other hand, we query all entities where PartitionKey is equal to "partition" and RowKey is greater than or equal to "m", which will eventually get us the original entity we inserted.

For more information on Windows Azure Tables, please refer to the Understanding the Table Service Data Model article and the How to get most out of Windows Azure Tables blog post.

// Create a table
wa::storage::cloud_table_client table_client = storage_account.create_cloud_table_client();
wa::storage::cloud_table table = table_client.get_table_reference(U("mytable"));
table.create_if_not_exists();

// Insert a table entity
wa::storage::table_entity entity(U("partition"), U("row"));
entity.properties().insert(wa::storage::table_entity::property_type(U("PropertyA"), wa::storage::table_entity_property(U("some string"))));
entity.properties().insert(wa::storage::table_entity::property_type(U("PropertyB"), wa::storage::table_entity_property(utility::datetime::utc_now())));
entity.properties().insert(wa::storage::table_entity::property_type(U("PropertyC"), wa::storage::table_entity_property(utility::new_uuid())));
wa::storage::table_operation operation1 = wa::storage::table_operation::insert_or_replace_entity(entity);
wa::storage::table_result table_result = table.execute(operation1);

// Retrieve a table entity
wa::storage::table_operation operation2 = wa::storage::table_operation::retrieve_entity(U("partition"), U("row"));
wa::storage::table_result result = table.execute(operation2);

// Query table entities
wa::storage::table_query query;
query.set_filter_string(wa::storage::table_query::combine_filter_conditions(
    wa::storage::table_query::generate_filter_condition(U("PartitionKey"), wa::storage::query_comparison_operator::equal, U("partition")),
    wa::storage::query_logical_operator::op_and,
    wa::storage::table_query::generate_filter_condition(U("RowKey"), wa::storage::query_comparison_operator::greater_than_or_equal, U("m"))));
std::vector<wa::storage::table_entity> results = table.execute_query(query);
Queues

In our final example, we will create a queue, add a message to it, retrieve the same message, and finally update it:

// Create a queue
wa::storage::cloud_queue_client queue_client = storage_account.create_cloud_queue_client();
wa::storage::cloud_queue queue = queue_client.get_queue_reference(U("myqueue"));
queue.create_if_not_exists();

// Add a queue message
wa::storage::cloud_queue_message message1(U("mymessage"));
queue.add_message(message1);

// Get a queue message
wa::storage::cloud_queue_message message2 = queue.get_message();

// Update a queue message
message2.set_content(U("changedmessage"));
queue.update_message(message2, std::chrono::seconds(30), true);
How to debug it?

When things go wrong, you might get an exception from one of your calls. This exception will be of type wa::storage::storage_exception and contain detailed information about what went wrong. Consider the following code:

try
{
blob1.download_attributes();
}
catch (const wa::storage::storage_exception& e)
{
std::cout << "Exception: " << e.what() << std::endl;
ucout << U("The request that started at ") << e.result().start_time().to_string() << U(" and ended at ") << e.result().end_time().to_string() << U(" resulted in HTTP status code ") << e.result().http_status_code() << U(" and the request ID reported by the server was ") << e.result().service_request_id() << std::endl;
}

When run on a non-existing blob, this code will print out:

Exception: The specified blob does not exist.

The request that started at Fri, 13 Dec 2013 18:31:11 GMT and ended at Fri, 13 Dec 2013 18:31:11 GMT resulted in HTTP status code 404 and the request ID reported by the server was 5de65ae4-9a71-4b1d-9c99-cc4225e714c6

The library also provides the type wa::storage::operation_context, which is supported by all APIs, to obtain more information about what is being done during an operation. Now consider the following code:

wa::storage::operation_context context;
context.set_sending_request([] (web::http::http_request& request, wa::storage::operation_context)
{
ucout << U("The request is being sent to ") << request.request_uri().to_string() << std::endl;
});

context.set_response_received([] (web::http::http_request&, const web::http::http_response& response, wa::storage::operation_context)
{
ucout << U("The reason phrase is ") << response.reason_phrase() << std::endl;
});

try
{
blob1.download_attributes(wa::storage::access_condition(), wa::storage::blob_request_options(), context);
}
catch (const wa::storage::storage_exception& e)
{
std::cout << "Exception: " << e.what() << std::endl;
}
ucout << U("Executed ") << context.request_results().size() << U(" request(s) to perform this operation and the last request's status code was ") << context.request_results().back().http_status_code() << std::endl;

Again, when run on a non-existing blob, this code will print out:

The request is being sent to http://myaccount.blob.core.windows.net/mycontainer/myblob?timeout=90

The reason phrase is The specified blob does not exist.

Exception: The specified blob does not exist.

Executed 1 request(s) to perform this operation and the last request's status code was 404

Samples

We have provided sample projects on GitHub to help get you up and running with each storage abstraction and to illustrate some additional key scenarios. All sample projects are found under the folder named samples.

1. Open the samples solution file named Microsoft.WindowsAzure.Storage.Samples.sln in Visual Studio.
2. Update your storage credentials in the samples_common.h file under the Microsoft.WindowsAzure.Storage.SamplesCommon project.
3. Go to the Solution Explorer window and select the sample project you want to run (for example, Microsoft.WindowsAzure.Storage.BlobsGettingStarted) and choose Set as StartUp Project from the Project menu (or, alternatively, right-click the project, then choose the same option from the context menu).

Summary

We welcome any feedback you may have in the comments section below, the forums, or GitHub. If you hit any bugs, filing them on GitHub will also allow you to track the resolution.

Serdar Ozler, Mike Fisher, and Joe Giardino

Resources

Source (GitHub)
Binaries (NuGet)
Windows Azure Storage Release - Introducing CORS, JSON, Minute Metrics, and More
Windows Azure Tables: Introducing JSON


Angshuman Nayak described how to fix an "IIS logs stop writing in cloud service" problem in a 12/21/2013 post to the Windows Azure Cloud Integration Engineering blog:

I was recently working on a case where the IIS logs stopped writing on all production boxes. There was 995 GB on the C:\ disk for logs, but log writing started only after some files were deleted. The current IIS logs directory size was 2.91 GB (3,132,727,296 bytes).

So I looked at the diagnostic.wadcfg files.

<?xml version="1.0" encoding="utf-8"?>
<DiagnosticMonitorConfiguration configurationChangePollInterval="PT1M" overallQuotaInMB="4096" xmlns="http://schemas.microsoft.com/ServiceHosting/2010/10/DiagnosticsConfiguration">
  <DiagnosticInfrastructureLogs scheduledTransferPeriod="PT1M" />
  <Directories scheduledTransferPeriod="PT1M">
    <IISLogs container="wad-iis-logfiles" />
    <CrashDumps container="wad-crash-dumps" />
  </Directories>
  <Logs bufferQuotaInMB="1024" scheduledTransferPeriod="PT1M" />
  <PerformanceCounters bufferQuotaInMB="512" scheduledTransferPeriod="PT1M">
    <PerformanceCounterConfiguration counterSpecifier="\Memory\Available MBytes" sampleRate="PT3M" />
    <PerformanceCounterConfiguration counterSpecifier="\Web Service(_Total)\ISAPI Extension Requests/sec" sampleRate="PT3M" />
    <PerformanceCounterConfiguration counterSpecifier="\Web Service(_Total)\Bytes Total/Sec" sampleRate="PT3M" />
    <PerformanceCounterConfiguration counterSpecifier="\ASP.NET Applications(__Total__)\Requests/Sec" sampleRate="PT3M" />
    <PerformanceCounterConfiguration counterSpecifier="\ASP.NET Applications(__Total__)\Errors Total/Sec" sampleRate="PT3M" />
    <PerformanceCounterConfiguration counterSpecifier="\ASP.NET\Requests Queued" sampleRate="PT3M" />
    <PerformanceCounterConfiguration counterSpecifier="\ASP.NET\Requests Rejected" sampleRate="PT3M" />
    <PerformanceCounterConfiguration counterSpecifier="\Processor(_Total)\% Processor Time" sampleRate="PT1S" />
  </PerformanceCounters>
  <WindowsEventLog bufferQuotaInMB="1024" scheduledTransferPeriod="PT1M" scheduledTransferLogLevelFilter="Error">
    <DataSource name="Application!*" />
    <DataSource name="System!*" />
  </WindowsEventLog>
</DiagnosticMonitorConfiguration>

I noticed that there is no bufferQuotaInMB mentioned for Directories and IIS logs.

<Directories scheduledTransferPeriod="PT1M">
  <IISLogs container="wad-iis-logfiles" />
  <CrashDumps container="wad-crash-dumps" />
</Directories>

The bufferQuotaInMB attribute needs to be specified, as described in this article: http://www.windowsazure.com/en-us/develop/net/common-tasks/diagnostics/. Making this change led to the IIS logs being moved to the Windows Azure Storage blob location at scheduled intervals, making space for new IIS logs to be collected.
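For example (the 1,024 MB buffer value below is illustrative, not the customer's actual setting), the Directories element would then read:

<Directories bufferQuotaInMB="1024" scheduledTransferPeriod="PT1M">
  <IISLogs container="wad-iis-logfiles" />
  <CrashDumps container="wad-crash-dumps" />
</Directories>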

Since there was a lot of disk space, I further recommended that the customer increase the quota limits. This needs to be done as follows:

a) Increase the Local Resource size. If this is not done, then irrespective of what value is allocated to OverallQuotaInMB, the value it will get is 4 GB.

<LocalResources>
  <LocalStorage name="DiagnosticStore" cleanOnRoleRecycle="false" sizeInMB="8192" />  <!-- Example of setting an 8 GB DiagnosticStore -->
</LocalResources>

b) Keep the OverallQuotaInMB at least 512 KB less than the value above. We need 512 KB for the GC to run; in case all the space is consumed, the GC can't run to move the logs.

c) The BufferQuotaInMB property sets the amount of local storage to allocate for a specified data buffer. You can specify the value of this property when configuring any of the diagnostic sources for a diagnostic monitor. By default, the BufferQuotaInMB property is set to 0 for each data buffer. Diagnostic data will be written to each data buffer until the OverallQuotaInMB limit is reached unless you set a value for this property. So the space left after the sum of all the directory quotas is deducted from OverallQuotaInMB is allocated to the buffer quota.

d) Also change the ScheduledTransferPeriodInMinutes to a non-zero value. The minimum is 1 minute; if it's set to 0, the diagnostics will not be moved to storage and you might lose them if the VM is re-incarnated at any time due to a guest or host OS update.
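For completeness, the same kind of quota changes can also be made in code rather than in diagnostics.wadcfg. The following is a rough sketch only (the quota values are illustrative, not the ones from this case), using the DiagnosticMonitor API available in the Windows Azure SDK at the time this was written:

using System;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Start from the default configuration and raise the quotas.
        DiagnosticMonitorConfiguration config = DiagnosticMonitor.GetDefaultInitialConfiguration();

        // Illustrative values: the overall quota must fit inside the DiagnosticStore
        // local resource declared for the role.
        config.OverallQuotaInMB = 7680;
        config.Directories.BufferQuotaInMB = 1024;                               // space for IIS logs and crash dumps
        config.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(1.0);  // non-zero so logs move to blob storage

        DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", config);
        return base.OnStart();
    }
}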


Carl Nolan (@carl_nolan) described Managing Your HDInsight Cluster using PowerShell Update in a 12/26/2013 post:

Since writing my last post Managing Your HDInsight Cluster and .Net Job Submissions using PowerShell, there have been some useful modifications to the Azure PowerShell Tools.

The HDInsight cmdlets no longer exist as these have now been integrated into the latest release of the Windows Azure Powershell Tools. This integration means:

You don't need to specify the Subscription parameter
If needed, you can use AAD authentication to Azure instead of certificates

Also, the cmdlets are fully backwards compatible, meaning you don't need to change your current scripts. The Subscription parameter is now optional, but if it is specified then it is honoured.

As such the cluster creation script will now be:

Param($Hosts = 4, [string] $Cluster = $(throw "Cluster Name Required."), [string] $StorageContainer = "hadooproot")

# Get the subscription information and set variables
$subscriptionInfo = Get-AzureSubscription -Default
$subName = $subscriptionInfo | %{ $_.SubscriptionName }
$subId = $subscriptionInfo | %{ $_.SubscriptionId }
$cert = $subscriptionInfo | %{ $_.Certificate }
$storeAccount = $subscriptionInfo | %{ $_.CurrentStorageAccountName }

Select-AzureSubscription -SubscriptionName $subName

$key = Get-AzureStorageKey $storeAccount | %{ $_.Primary }
$storageAccountInfo = Get-AzureStorageAccount $storeAccount
$location = $storageAccountInfo | %{ $_.Location }

$hadoopUsername = "Hadoop"
$clusterUsername = "Admin"
$clusterPassword = "myclusterpassword"
$secpasswd = ConvertTo-SecureString $clusterPassword -AsPlainText -Force
$clusterCreds = New-Object System.Management.Automation.PSCredential($clusterUsername, $secpasswd)

$clusterName = $Cluster
$numberNodes = $Hosts
$containerDefault = $StorageContainer
$blobStorage = "$storeAccount.blob.core.windows.net"

# tidyup the root to ensure empty
# Remove-AzureStorageContainer -Name $containerDefault -Force
Write-Host "Deleting old storage container contents: $containerDefault" -f yellow
$blobs = Get-AzureStorageBlob -Container $containerDefault
foreach($blob in $blobs)
{
    Remove-AzureStorageBlob -Container $containerDefault -Blob ($blob.Name)
}

# Create the cluster
Write-Host "Creating '$numberNodes' Node Cluster named: $clusterName" -f yellow
Write-Host "Storage Account '$storeAccount' and Container '$containerDefault'" -f yellow
Write-Host "User '$clusterUsername' Password '$clusterPassword'" -f green
New-AzureHDInsightCluster -Certificate $cert -Name $clusterName -Location $location -DefaultStorageAccountName $blobStorage -DefaultStorageAccountKey $key -DefaultStorageContainerName $containerDefault -Credential $clusterCreds -ClusterSizeInNodes $numberNodes
Write-Host "Created '$numberNodes' Node Cluster: $clusterName" -f yellow

The only changes are the selection of the subscription and the removal of the Subscription option when creating the cluster. Of course all the other scripts can easily be modified along the same lines; such as the cluster deletion script:

Param($Cluster = $(throw "Cluster Name Required."))

# Get the subscription information and set variables
$subscriptionInfo = Get-AzureSubscription -Default
$subName = $subscriptionInfo | %{ $_.SubscriptionName }
$subId = $subscriptionInfo | %{ $_.SubscriptionId }
$cert = $subscriptionInfo | %{ $_.Certificate }
$storeAccount = $subscriptionInfo | %{ $_.CurrentStorageAccountName }

Select-AzureSubscription -SubscriptionName $subName

$clusterName = $Cluster

# Delete the cluster
Write-Host "Deleting Cluster named: $clusterName" -f yellow
Remove-AzureHDInsightCluster $clusterName -Subscription $subId -Certificate $cert
Write-Host "Deleted Cluster $clusterName" -f yellow

This cluster creation script also contains an additional section to ensure that the default storage container does not contain any leftover files from previous cluster creations:

$blobs = Get-AzureStorageBlob -Container $containerDefault
foreach($blob in $blobs)
{
    Remove-AzureStorageBlob -Container $containerDefault -Blob ($blob.Name)
}

The rationale behind this is to ensure that any files that may be left over from previous cluster creations/deletions are removed.
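For reference, assuming the creation and deletion scripts above are saved as CreateCluster.ps1 and DeleteCluster.ps1 (the file names and cluster name below are mine, not Carl's), invoking them from a Windows Azure PowerShell session looks something like this:

# Create a 4-node cluster named "oakleaf-hd" using the default "hadooproot" container
.\CreateCluster.ps1 -Hosts 4 -Cluster "oakleaf-hd" -StorageContainer "hadooproot"

# Remove the cluster when it is no longer needed
.\DeleteCluster.ps1 -Cluster "oakleaf-hd"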



Windows Azure SQL Database, Federations and Reporting, Mobile Services

Carlos Figueira (@carlos_figueira) described new Enhanced Users Feature in Azure Mobile Services in a 12/16/2013 post:

In the last post I talked about the ability to get an access token with more permissions to talk to the authentication provider API, which made that feature more powerful. Last week we released a new update to the mobile services that makes some of that easier. With the new (preview) enhanced users feature, expanded data about the user logged in to the mobile service is available directly at the service itself (via a call to user.getIdentities), so there's no more need to talk to the authentication provider APIs to retrieve additional data about the user. Let's see how we can do that.

To opt in to this preview feature, you'll need the Command-Line Interface. If you haven't used it before, I recommend checking out its installation and usage tutorial to get acquainted with its basic commands. Once you have it installed, and the publishing profile for your subscription properly imported, we can start. For this blog post, I created a new mobile service called blog20131216; as a newly created service, it doesn't have any of the preview features enabled, which we can see with the azure mobile preview list command, which lists the new feature we have:

C:\temp>azure mobile preview list blog20131216
info: Executing command mobile preview list
+ Getting preview features
data: Preview feature Enabled
data: --------------- -------
data: SourceControl No
data: Users No
info: You can enable preview features using the 'azure mobile preview enable' command.
info: mobile preview list command OK

A sample app

Let's see how the feature behaves before we turn it on. As usual, I'll create a simple app which will demonstrate how this works, and I'll post it to my blogsamples repository on GitHub, under AzureMobileServices/UsersFeature. This is an app in which we can log in with certain providers, and call a custom API or a table script. Here's the app:

The API implementation does nothing but return the result of the call to user.getIdentities in its response.

exports.get = function (request, response) {
    request.user.getIdentities({
        success: function (identities) {
            response.send(statusCodes.OK, identities);
        }
    });
};

And the table script will do the same (bypassing the database):

function read(query, user, request) {
    user.getIdentities({
        success: function (identities) {
            request.respond(200, identities);
        }
    });
}

Now, if we run the app, login with all providers and call the API and table scripts, we get what we currently receive when calling the getIdentities method:

Logged in as Facebook:my-fb-id
API result: {
  "facebook": {
    "userId": "Facebook:my-fb-id",
    "accessToken": "the long access token"
  }
}
Logged out
Logged in as Google:my-google-id
Table script result: {
  "google": {
    "userId": "Google:my-google-id",
    "accessToken": "the access token"
  }
}
Logged out
Logged in as MicrosoftAccount:my-microsoft-id
API result: {
  "microsoft": {
    "userId": "MicrosoftAccount:my-microsoft-id",
    "accessToken": "the very long access token"
  }
}

So for all existing services nothing really changes.

Enabling the users feature

Now let's see what starts happening if we enable that new preview feature.

Caution: just like the source control preview feature, the users feature cannot be disabled once turned on. I'd strongly suggest you try the feature in a test mobile service prior to enabling it in a service being used by an actual mobile application.

In the Command-Line Interface, let's enable the users feature:

C:\temp>azure mobile preview enable blog20131216 Users
info: Executing command mobile preview enable
+ Enabling preview feature for mobile service
info: Result of enabling feature:
info: Successfully enabled Users feature for Mobile Service 'blog20131216'
info: mobile preview enable command OK

And just like that (after a few seconds), the feature is enabled:

C:\temp>azure mobile preview list blog20131216
info: Executing command mobile preview list
+ Getting preview features
data: Preview feature Enabled
data: --------------- -------------
data: Users Yes
data: SourceControl No
info: You can enable preview features using the 'azure mobile preview enable' command.
info: mobile preview list command OK

Now it's time to run the same test as before, and the result is a lot more interesting. For Facebook:

Logged in as Facebook:my-fb-id
API result: {
facebook: {
id: my-fb-id,
username: my-facebook-username,
name: Carlos Figueira,
gender: male,
first_name: Carlos,
last_name: Figueira,
link: https://www.facebook.com/my-facebook-username,
locale: en_US,
accessToken: the-long-access-token,
userId: Facebook:my-fb-id
}
}

And Google:

Logged in as Google:my-google-id
Table script result: {
google: {
family_name: Figueira,
given_name: Carlos,
locale: en,
name: Carlos Figueira,
picture: https://lh3.googleusercontent.com/some-path/some-other-path/yet-another-path/and-one-more-path/photo.jpg,
sub: my-google-id,
accessToken: the-access-token,
userId: Google:my-google-id
}
}

And Microsoft:

Logged in as MicrosoftAccount:my-microsoft-id
API result: {
microsoft: {
id: my-microsoft-id,
name: Carlos Figueira,
first_name: Carlos,
last_name: Figueira,
link: https://profile.live.com/,
gender: null,
locale: en_US,
accessToken: the very long access token
userId: MicrosoftAccount:my-microsoft-id
}
}

So that's basically what we now get for free with this new feature. One more thing: I didn't include Twitter in the example, but it works just as well as the other providers; you'll get information such as the screen name in the user identities.

User.getIdentities change

One thing I showed but didn't really talk about: the call to user.getIdentities is different from how it was previously used. It's a change required to support the new functionality, and the synchronous version may eventually be deprecated. In order to give you more information about the users, we're now storing that information in a database table, and retrieving that data is something we cannot do synchronously. What we need to change in the scripts, then, is to pass a callback object to the user.getIdentities call; when the data is retrieved, the success callback will be invoked. Changing the code to comply with the new mode is fairly straightforward, as the code below shows. Here's the before code:

exports.get = function (request, response) {
    var identities = request.user.getIdentities();
    // Do something with identities, send response
}

And with the change we just need to move the remainder of the code into a callback function:

exports.get = function (request, response) {
    request.user.getIdentities({
        success: function (identities) {
            // Do something with identities, send response
        }
    });
}

The synchronous version of user.getIdentities will continue working in mobile services where the users feature is not enabled (since we can't break existing services), but I strongly recommend changing your scripts to use the asynchronous version to be prepared for future changes.

Wrapping up

As I mentioned, this is a preview feature, which means it's something we intend to promote to GA (general availability), but we'd like to get some early feedback so we can better align it with what customers want. Feel free to give us feedback about this feature and what we can do to improve it.

No significant articles so far this week.


Return to section navigation list

Windows Azure Marketplace DataMarket, Cloud Numerics, Big Data and OData
No significant articles so far this week.

Return to section navigation list

Windows Azure Service Bus, BizTalk Services and Workflow

Sam Vanhoutte (@SamVanhoutte) described Step by step debugging of bridges in Windows Azure BizTalk Services in a 12/15/2013 post:

Introduction

The BizTalk product team recently published BizTalk Service Explorer to the Visual Studio Gallery. It can be downloaded by clicking this link. The tool is still an alpha version and will keep being improved over time. The explorer adds a node to the Visual Studio Server Explorer through which a BizTalk Service can be managed. The following capabilities are available:

Browsing of all artifacts (endpoints, bridges, schemas, assemblies, transformations, certificates)
See tracked events of bridges
Upload and download artifacts
Start and stop endpoints
Restarting the services
Testing and debugging bridges

Testing capabilities

This blog post will be about that last capability: the testing and debugging of bridges. With the tool, it's much easier to test bridges and to get insight into the actual processing and the state of messages at the various stages. It is not (yet?) possible to do step-through debugging of custom code.

Sending test messages

Where we previously had to build our own testing tools or use the MessageSender console app, we now have integrated testing capabilities in Visual Studio. To do this, you can just right-click a bridge and send a test message to it.

After doing so, you get a simple dialog box where you can have your bridge tested.

Debugging bridges

This feature is very neat. Here you can proceed step by step through a BizTalk bridge and see how the message and the context change from stage to stage.

How does it work? Through service bus relay! First of all, you have to configure a service bus namespace with the right credentials. This service bus namespace can be an existing one. Configuring this can be done on the properties of the service node:
Once this is done, the explorer will start up a service bus relayed web service that will be called from the actual runtime.

Debug a bridge

To debug a bridge, you just have to right-click it and select the debug option. This will open a new window, as depicted below. When stepping through the pipeline, the message and the context of the current and the previous stage are visible side by side, so they can easily be compared. This is ideal for seeing the mapping results, the effects of custom message inspectors, and the responses of LOB systems.

Be aware that, when a bridge is in debug mode, messages from other consumers can also be reflected and visualized in the tool. (I tested this by sending a message through the tool and by sending another one outside of Visual Studio.) So, don't use this on bridges in production!

Feedback for next versions

This tool really makes it more productive and intuitive to test bridges and the behavior of BizTalk Services. I really hope it will continue to evolve, and therefore I want to give some feedback here:

I would love to see the HTTP header values in the test or debug tool.
For bridge debugging, it would be nice to right-click a stage and specify something like "continue to here". This would make it even faster to test.
The tracking of events should be better visualized. Instead of showing just a list of GUIDs, I would love to see timestamps at least, and I would love to see the tracked items in a better format.
It would also be nice if you could link one (or more?) test files with a specific bridge in the explorer. Then it would be faster to test.
Another nice-to-have would be to drag my test message onto the bridge node to open the test or debug window, pre-filled with the dropped file.


Return to section navigation list

Windows Azure Access Control, Active Directory, Identity and Workflow

Yossi Dahan (@YossiDahan) described Role-based authorisation with Windows Azure Active Directory in an 11/28/2013 post (missed when published):

Using Windows Azure Active Directory (WaaD) to provide authentication for web applications hosted anywhere is dead simple. Indeed, in Visual Studio 2013 it is a couple of steps in the project creation wizard.

This provides a means of authentication but not authorisation, so what does it take to support authorisation using the [Authorise(Roles="xxx")] approach?

Conceptually, what's needed is a means to convert information from claims into role information about the logged-on user.

WaaD supports creating Groups and assigning users to them, which is what's needed to drive role-based authorisation. The problem is that, somewhat surprisingly perhaps, group membership is not reflected in the claim-set delivered to the application as part of the ws-federation authentication.

Fortunately, getting around this is very straightforward. Microsoft actually published a good example here, but I found it a little bit confusing and wanted to simplify the steps somewhat to explain the process more clearly; hopefully I've succeeded.

The process involves extracting the claims principal from the request and, from the provided claims, finding the WaaD tenant.

With that, and with prior knowledge of the clientId and key for the tenant (exchanged out of band and kept securely, of course), the WaaD Graph API can be used to query the group membership of the user.

Finally, the groups can be used to add role claims to the claim-set, which WIF automatically populates as roles, allowing the program to use IsInRole and the [Authorise] attribute as it would normally.

So how is all of this done?

The key is to add a ClaimsAuthenticationManager, which will get invoked when an authentication response is detected, and in it perform the steps described.

A slightly simplified (as opposed to better!) version of the sample code is as follows:

public override ClaimsPrincipal Authenticate(string resourceName, ClaimsPrincipal incomingPrincipal)
{
    //only act if a principal exists and is authenticated
    if (incomingPrincipal != null && incomingPrincipal.Identity.IsAuthenticated == true)
    {
        //get the Windows Azure Active Directory tenantId
        string tenantId = incomingPrincipal.FindFirst("http://schemas.microsoft.com/identity/claims/tenantid").Value;

        // Use the DirectoryDataServiceAuthorizationHelper graph helper API
        // to get a token to access the Windows Azure AD Graph
        string clientId = ConfigurationManager.AppSettings["ida:ClientID"];
        string password = ConfigurationManager.AppSettings["ida:Password"];

        //get a JWT authorisation token for the application from the directory
        AADJWTToken token = DirectoryDataServiceAuthorizationHelper.GetAuthorizationToken(tenantId, clientId, password);

        // initialize a graphService instance. Use the JWT token acquired in the previous step.
        DirectoryDataService graphService = new DirectoryDataService(tenantId, token);

        // get the user's ObjectId
        String currentUserObjectId = incomingPrincipal.FindFirst("http://schemas.microsoft.com/identity/claims/objectidentifier").Value;

        // Get the User object by querying Windows Azure AD Graph
        User currentUser = graphService.directoryObjects.OfType<User>().Where(it => it.objectId == currentUserObjectId).SingleOrDefault();

        // load the memberOf property of the current user
        graphService.LoadProperty(currentUser, "memberOf");

        //read the values of the memberOf property
        List<Group> currentRoles = currentUser.memberOf.OfType<Group>().ToList();

        //take each group the user is a member of and add it as a role
        foreach (Group role in currentRoles)
        {
            ((ClaimsIdentity)incomingPrincipal.Identity).AddClaim(new Claim(ClaimTypes.Role, role.displayName, ClaimValueTypes.String, "SampleApplication"));
        }
    }
    return base.Authenticate(resourceName, incomingPrincipal);
}

You can follow the comments to pick up the actions in the code; in broad terms the identity and the tenant id are extracted from the token, the clientid and key are read from the web.config (VS 2013 puts them there automatically, which is very handy!), an authorisation token is retrieved to support calls to the graph API and the graph service is then used to query the user and its group membership from WaaD before converting, in this case, all groups to role claims.

To use the Graph API I used the Graph API helper source code as pointed out here. In Visual Studio 2013 I updated the references to Microsoft.Data.Services.Client and Microsoft.Data.OData to 5.6.0.0.

Finally, to plug my ClaimsAuthenticationManager into the WIF pipeline, I added this bit of configuration:

<system.identityModel>
  <identityConfiguration>
    <claimsAuthenticationManager
      type="WebApplication5.GraphClaimsAuthenticationManager,WebApplication5" />
  </identityConfiguration>
</system.identityModel>

With this done, the ClaimsAuthenticationManager kicks in after authentication and injects the role claims; WIF's default behaviour then does its magic, and in my controller I can use, for example:

[Authorize(Roles = "Readers")]
public ActionResult About()
{
    ViewBag.Message = "Your application description page.";
    return View();
}

Cross posted on the Solidsoft blog.


Return to section navigation list

Windows Azure Virtual Machines, Virtual Networks, Web Sites, Connect, RDP and CDN

My (@rogerjenn) Microsoft Finally Licenses Remote Desktop Services (RDS) on Windows Azure Virtual Machines post included these new updates from 12/21 to 12/27/2013:

Update 12/27/2013 with how to Setup a Windows Server 2012 R2 Domain Controller in Windows Azure: IP Addressing and Creating a Virtual Network from the Petri IT Knowledgebase
Update 12/25/2013 with the new Windows Azure Desktop Hosting - Reference Architecture and Deployment Guides, updated 10/31/2013, from the MSDN Library
Update 12/24/2013 with links to the Remote Desktop Services Blog, which offers useful articles and links to related RDS resources, and the Windows 8.1 ITPro Forum
Update 12/23/2013 with an updated Remote Desktop Client for Windows 8 and 8.1, which you can download and install from the Windows Store
Update 12/22/2013 with Microsoft Hosting's announcement of an Updated SPLA Program Guide and New program: Server Cloud Enrollment in a 12/20/2013 message
Update 12/21/2013 with links to Remote Desktop Services Overview from TechNet, updated for Windows Server 2012 R2, and Keith Mayer's RDS on Windows Azure tutorial

Mark Brown posted Using Django, Python, and MySQL on Windows Azure Web Sites: Creating a Blog Application on 12/17/2013:

Editor's Note: This post comes from Sunitha Muthukrishna (@mksuni, pictured here), Program Manager on the Windows Azure Web Sites team

Depending on the app you are writing, the basic Python stack on Windows Azure Web Sites might meet your needs as-is, or it might not include all the modules or libraries your application may need.

Never fear, because in this blog post I'll take you through the steps to create a Python environment for your application by using Virtualenv and Python Tools for Visual Studio. Along the way, I'll show you how you can put your Django-based site on Windows Azure Web Sites.

Create a Windows Azure Web Site with a MySQL database

Next, log in to the Azure Management Portal and create a new web site using the Custom create option. For more information, see How to Create Azure Websites. We'll create an empty website with a MySQL database.

Finally, choose a region and, after you accept the site terms, you can complete the install. As usual, it's a good idea to put your database in the same region as your web site to reduce costs.

Double-click your website in the Management Portal to view the website's dashboard. Click on Download publish profile. This will download a .publishsettings file that can be used for deployment in Visual Studio.

Create a Django project

For this tutorial we will be using Visual Studio to build our Django web application. To build your application with Visual Studio, install PTVS 2.0. For more details, see How to build Django Apps with Visual Studio.

Open Visual Studio and create a New Project > Other Languages > Python > Django Project.


In the Solution Explorer, create a new Django application in your Django project by right-clicking on DjangoProject > Add > Django App.


Enter the name of your Django application, say myblog

Create a virtual environment

Simply put, virtualenv allows you to create custom, isolated Python environments. That is, you can customize and install different packages without impacting the rest of your site. This makes it useful for experimentation as well.

In the Solution Explorer, right-click on Python Environments in your Django project and select Add Virtual Environment.

Enter the virtual environment name, say env. This will create a folder called env which will contain your virtual Python environment without any Python packages except for pip.
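If you are not using the Visual Studio tooling, the equivalent from a Windows command prompt is roughly the following (a sketch, assuming the virtualenv package is already installed and the commands are run from the project folder):

virtualenv env
env\Scripts\activate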

Install MySQL-Python and Django packages in your virtual environment

In the Solution Explorer, right-click on the env environment and select Install Python package, then enter django.


You can see the Output of the installation of Django in your virtual environment


Similarly, you need to install mysql-python, but use easy_install instead of pip, as seen here:

Now you have both Django and MySQL for Python installed in your virtual environment
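A quick way to confirm that both packages landed in the virtual environment is to import them from a Python prompt running inside that environment (a minimal sanity check, not part of the original walkthrough):

# Run from a Python prompt inside the activated virtual environment
import django
import MySQLdb   # module provided by the mysql-python package

print(django.get_version())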

Build your Database Models

A model in Django is a class that inherits from the Django Model class and lets you specify all the attributes of a particular object. The Model class translates its properties into values stored in the database.

Let's create a simple model called Post with three fields (title, date and body) to build a post table in the database. To create the model, include a models.py file under the myblog/ folder.

#import from Model class
from django.db import models

class Post(models.Model):
    #Create a title property
    title = models.CharField(max_length=64)

    #Create a date property
    date = models.DateTimeField()

    #Create a body of content property
    body = models.TextField()

    # This method is just like the toString() function in .NET. Whenever Python needs to show a
    # string representation of an object, it calls __str__
    def __str__(self):
        return "%s" % (self.title)
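For illustration only (this walkthrough uses the admin site to create posts), a Post could also be created programmatically from the Django shell once the table exists; the title and body values here are hypothetical:

from django.utils import timezone
from myblog.models import Post

post = Post(title="Hello Azure", date=timezone.now(), body="First post created from the Django shell")
post.save()                      # inserts a row into the post table
print(Post.objects.count())      # number of posts stored so far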

Installing the Models

We'll need to tell Django to create the model in the database. To do this, we need to do a few more things.

First, we'll configure the application's database in settings.py. Enter the MySQL database information associated with the Windows Azure Web Site.

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'MYSQL-DATABASE-NAME',
        'USER': 'MYSQL-SERVER-USER-NAME',
        'PASSWORD': 'MYSQL-SERVER-USER-PASSWORD',
        'HOST': 'MySQL-SERVER-NAME',
        'PORT': '',
    }
}

Next, add your application to your INSTALLED_APPS setting in settings.py.

INSTALLED_APPS = (
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.sites',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'myblog',
)

Once we've saved the settings in settings.py, we'll create the schema in the ClearDB database for the models we've already added to models.py. This can be achieved with the Run Django Sync DB command.
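If you prefer to run this from a command prompt (or the Django Shell) instead of the Visual Studio menu, the equivalent on the Django versions current at the time of this post (prior to 1.7) is:

python manage.py syncdb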

You can write your own code to manage creating, editing, and deleting posts for your blog, or you can use the administration module Django offers, which provides an admin site dashboard to create and manage posts. Refer to this article on how to enable a Django admin site.

Setup a Django Admin site

The admin site will provide a dashboard to create and manage blog posts. First, we need to create a superuser who can access the admin site. To do this, run this command if you haven't created an admin user already:

python manage.py createsuperuser

You can use the Django Shell to run this command. For more details on how to use the Django Shell, refer to this article.

The admin module is not enabled by default, so we need to do the following few steps:

First, we'll add 'django.contrib.admin' to the INSTALLED_APPS setting in settings.py:

INSTALLED_APPS = (
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.sites',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'myblog',
)

Now, we'll update urls.py to route requests made to the application to the admin site and to the home page view.

from django.conf.urls import patterns, include, url

#import admin module
from django.contrib import admin
admin.autodiscover()

#set url patterns to handle requests made to your application
urlpatterns = patterns('',
    url(r'^$', 'DjangoApplication.views.home', name='home'),
    url(r'^admin/', include(admin.site.urls)),
)

Next, we'll create an admin.py under the myblog/ folder to register the Post model:

from models import Post
from django.contrib import admin

#Register the database model so it will be visible in the Admin site
admin.site.register(Post)

Build a Page view

We'll create a view to list all the blog posts you have created. To create the page view, include a views.py file under the myblog/ folder:

from django.shortcuts import render_to_response
from models import Post

#Creating a view for the home page
def home(request):
    posts = Post.objects.all()
    #Renders a given template with a given context dictionary and returns an
    #HttpResponse object with that rendered text.
    return render_to_response('home.html', {'posts': posts})

Displaying a raw Post object is not very helpful to users; we need a more informative page to show the list of posts. This is a case where a template is helpful. Usually templates are used for producing HTML, but Django templates are equally capable of generating any text-based format.

To create this template, first we'll create a directory called templates under myblog/. To display all the posts returned by views.py, create a home.html under the templates/ folder which loops through all the Post objects and displays them:

<html>
<head><title>My Blog</title></head>
<body>
<h1>My Blog</h1>
{% for post in posts %}
<h1>{{ post.title }}</h1>
<em><time datetime="{{ post.date.isoformat }}">
{{ post.date }}</time><br/>
</em>
<p>{{ post.body }}</p>
{% endfor %}
</body>
</html>

Set static directory path

If you access the admin site now, you will notice that the style sheets are broken. The reason for this is that the static directory is not configured for the application.

Let's set the static root folder path to D:\home\site\wwwroot\static:

from os import path

PROJECT_ROOT = path.dirname(path.abspath(path.dirname(__file__)))
STATIC_ROOT = path.join(PROJECT_ROOT, 'static').replace('\\', '/')
STATIC_URL = '/static/'

Once we've saved these changes in settings.py, run this command from the Django Shell to collect all the static files for the admin site into the static folder:

python manage.py collectstatic

Set Template Directory path

We're nearly done! Django requires the path to the templates directory and the static folder directory to be configured in settings.py. There are just a couple of steps needed to do that.

Let's create a variable for the path of the SITE_ROOT:

import os.path

SITE_ROOT = os.path.dirname(__file__)

Then, we'll set the path for the templates folder. TEMPLATE_DIRS tells Django where to look for your application's templates when a request is made.

TEMPLATE_DIRS = (
    os.path.join(SITE_ROOT, 'templates'),
)

Deploy the application

We are now all set to deploy the application to the Windows Azure web site, mydjangoblog. Right-click your DjangoProject and select Publish.

You can validate the connection and then click Publish to initiate deployment. Once deployment completes successfully, you can browse to your website to create your first blog post.

Create your blog post

To create your blog post, log in to the admin site http://mydjangoblog.azurewebsites.net/admin with the superuser credentials we created earlier.

The dashboard will include a link for your model and will allow you to manage the content for the model used by your application. Click on Posts.

Create your first blog post and Save

Let's browse to the site's home page to view the newly created post.

Now you have the basics for building just what you need in Python on Windows Azure Web Sites. Happy coding :)

Further Reading:
Django Project
Python Tools for Visual Studio Wiki
Video tutorials for Python Tools for Visual Studio
Windows Azure Websites (WAWS)


Anton Staykov (@astaykov) posted Windows Azure secrets of a Web Site on 12/20/2013:

Windows Azure Web Sites are, I would say, the highest form of Platform-as-a-Service. As per the documentation, it is "The fastest way to build for the cloud." It really is. You can start easily and fast; within minutes you will have your web site running in the cloud in a high-density shared environment. And within minutes you can go to 10 Large instances reserved only for you! This is huge: 40 CPU cores with a total of 70 GB of RAM, just for your web site. I would say you will need to re-engineer your site before going that big. So what are the secrets?

Project KUDU

What very few people know or realize is that Windows Azure Web Sites runs Project KUDU, which is publicly available on GitHub. Yes, that's right: Microsoft has released Project KUDU as an open source project so we can all peek inside, learn, and even submit patches if we find something wrong.

Deployment Credentials

There are multiple ways to deploy your site to Windows Azure Web Sites: starting with plain old FTP, going through Microsoft's Web Deploy, and ending with automated deployment from popular source code repositories like GitHub, Visual Studio Online (formerly TFS Online), DropBox, BitBucket, a local Git repo, and even an external provider that supports Git or Mercurial source control. And this is all thanks to the KUDU project. As we know, the Windows Azure Management Portal is protected by (very recently) Windows Azure Active Directory, and most of us use our Microsoft Accounts to log in (formerly known as Windows Live ID). Well, GitHub, FTP, Web Deploy, etc. know nothing about a Live ID. So, in order to deploy a site, we actually need deployment credentials. There are two sets of deployment credentials. User-level deployment credentials are bound to our personal Live ID; we set a user name and password, and these are valid for all web sites and subscriptions the Live ID has access to. Site-level deployment credentials are auto-generated and are bound to a particular site. You can learn more about deployment credentials on the WIKI page.

KUDU console

I'm sure very few of you knew about the live streaming logs feature and the development console in Windows Azure Web Sites. And yet they are there. For every site we create, we get a domain name like

http://mygreatsite.azurewebsites.net/

And behind each site, one additional mapping is automatically created:

https://mygreatsite.scm.azurewebsites.net/

Which currently looks like this:

A key and very important fact: this console runs under HTTPS and is protected by your deployment credentials! This is KUDU! Now you see there are a couple of menu items, such as Environment, Debug Console, Diagnostics Dump and Log Stream. The titles are pretty much self-explanatory. I highly recommend that you jump in and play around; you will be amazed! Here, for example, is a screenshot of the Debug Console:

Nice! This is a command prompt that runs on your web site. It has the security context of your web site, so it is pretty restricted. But it also has PowerShell! Yes, it does. However, in its alpha version you can only execute commands which do not require user input. Still something!

Log Stream

The last item in the menu of your KUDU magic is Streaming Logs:

Here you can watch, in real time, all the logging of your web site. OK, not all of it, but everything you've sent to System.Diagnostics.Trace.WriteLine(string message) will come here. Not the IIS logs; your application's logs.

Web Site Extensions

The big thing which I described in my previous post is developed entirely using KUDU Site Extensions; it is an Extension! And, if you have played around with it, you might already have noticed that it actually runs under

https://mygreatsite.scm.azurewebsites.net/dev/wwwroot/

So what are web site extensions? In short, these are small (web) apps you can write and install as part of your deployment. They will run under a separate, restricted area of your web site and will be protected by deployment credentials behind HTTPS-encrypted traffic. You can learn more by visiting the Web Site Extensions WIKI page on the KUDU project.

My Android MiniPC and TVBoxes site is an example of a Windows Azure Web Site.


Return to section navigation list

Windows Azure Cloud Services, Caching, APIs, Tools and Test Harnesses

No significant articles so far this week.


Return to section navigation list

Windows Azure Infrastructure and DevOps

DH Kass asked Windows Azure, Office 365, Bing and Xbox Live to gain more processing power? in a deck for his Microsoft Cloud Bet?: $11M Land Deal for Giant Data Center article of 12/30/2013 for the VarGuy blog:

Microsoft (MSFT) has plunked down $11 million to buy 200 acres of industrial land in rural Port of Quincy, WA to build a second data center there. Is this another huge bet on Windows Azure, Office 365, Xbox Live and other cloud services? Perhaps.

NY Times/Kyle Bair/Bair Aerial

The new facility will triple the size of Microsoft's current 470,000 square foot server plant. The vendor has some history in Quincy--first setting up a data center there in 2007 on 75 acres of agricultural land, attracted by cheap electrical power fed by hydroelectric generators drawing water from the nearby Columbia River--and three years later signaling its intention to expand its facilities in the area.

According to a report in the Seattle Times, the latest deal is said to be among the largest in Quincy's history, with Microsoft paying $4 million for 60 acres of land already owned by the city and another $7 million for 142 acres of neighboring parcels Quincy will buy from private owners and sell back to the IT giant.

The transaction is expected to close in late January, 2014, followed by ground-breaking on the project in the spring with completion expected in early 2015. Microsoft said the data center will employ about 100 people.

Big Data Centers

The resulting facility is expected to be a colossus among giants. Quincy also hosts data centers belonging to IT heavyweights Dell, Intuit (INTU) and Yahoo (YHOO), and wholesalers Sabey and Vantage. Agriculture and food processing giants ConAgra Foods, National Frozen Foods, NORPAC, Columbia Colstor, Oneonta, Stemilt, CMI and Jones Produce also have data centers there.

In addition, direct sales giant Amway is slated to open a $38 million, 48,000 square foot botanical concentrate manufacturing facility in Quincy in May, 2014 to replace an existing plant near Los Angeles.

In moves to capture ballooning demand for connectivity in eastern Washington, network operator and data center services provider Level 3 (LVLT) has expanded its network's fiber backbone in the area, installing more than 200 miles of fiber cable and connections in Quincy alone. In addition, Level 3 also is connecting customers to regional incumbent local exchange carriers.

The Quincy data center project isn't Microsoft's sole server farm expansion. A Neowin report said the vendor also has offered a deal to The Netherlands to set up a $2.7 billion data center in the Noord-Holland province.

Will this data center be designated NorthWest US for Windows Azure?

Read the rest of the article here.


Nick Harris (@cloudnick) and Chris Risner (@chrisrisner) produced Cloud Cover Episode 124: Using Brewmaster to Automate Deployments on 12/20/2013:

In this episode Chris Risner and Nick Harris are joined by Ryan Dunn, the founder of Cloud Cover and now Director of Product Services at Aditi. In this video, Ryan talks about Project Brewmaster. Brewmaster is a free tool which enables easy and automated deployment of a number of different configurations to Windows Azure. Some of the things Brewmaster makes it easy to deploy include:

Load-balanced webfarms
Highly available and AlwaysOn SQL Server
SharePoint
ARR Reverse Proxy
Active Directory in multiple data centers

All of the deployments are template based which will eventually allow further customization of the deployment as well as creation of new templates.


Return to section navigation list

Windows Azure Pack, Hosting, Hyper-V and Private/Hybrid Clouds

No significant articles so far this week.


Return to section navigation list

Visual Studio LightSwitch and Entity Framework 4.1+

Rowan Miller reported EF 6.1 Alpha 1 Available in a 12/20/2013 post to the ADO.NET Blog:

Since the release of EF6 a couple of months ago, our team has been working on the EF6.1 release. This is our next release that will include new features.

What's in Alpha 1

It's still early days for the 6.1 release, so there aren't many new features to try out just yet. Most of our work so far has been fixing bugs, improving performance, and laying groundwork for new features.

Runtime

The following items are included in the Alpha 1 runtime:

Handling of transaction commit failures provides the ability to detect and recover automatically when transient connection failures affect the acknowledgement of transaction commits. You can read more about this feature in the specification on our CodePlex site.
EntityFramework.SqlServerCompact.Legacy is a new NuGet package (contributed by ErikEJ) that allows you to use Entity Framework 6 to target SQL Compact 3.5.
Bug fixes, minor features, and performance improvements included in the Alpha 1 release can be viewed on our CodePlex site.

Tooling

Our 6.1 tooling code base isn't ready for preview yet. We'll be including the tooling in future previews of EF6.1.

Where do I get Alpha 1?

The runtime is available on NuGet. If you are using Code First then there is no need to install the tooling. Follow the instructions on our Get It page for installing the latest pre-release version of Entity Framework runtime.

The tooling isn't ready for preview yet, but you can use the EF6.1 Alpha 1 runtime with the existing EF6 tooling for Visual Studio 2012 and 2013.

Support

This is a preview of features that will be available in the final release of EF6.1 and is designed to allow you to try out the new features and report any issues you encounter. Microsoft does not guarantee any level of support on this release.

If you need assistance using the new features, please post questions on Stack Overflow using the entity-framework tag.

What's next?

We still have plenty of bugs to fix and some new features to add before we release EF6.1. There will be more previews over the coming months including previews of the EF6.1 Tooling.


Beth Massi (@bethmassi) posted Beginning LightSwitch in VS 2013 Part 5: May I? Controlling Access with User Permissions on 12/18/2013:

NOTE: This is the Visual Studio 2013 update of the popular Beginning LightSwitch article series. For previous versions see:

Visual Studio 2012: Part 4: Too much information! Sorting and Filtering Data with Queries
Visual Studio 2010: Part 4: Too much information! Sorting and Filtering Data with Queries

Welcome to Part 5 of the Beginning LightSwitch in Visual Studio 2013 series! In Parts 1-4 we learned about entities, relationships, screens and queries in Visual Studio LightSwitch. If you missed them:

Part 1: What's in a Table? Describing Your Data
Part 2: Feel the Love - Defining Data Relationships
Part 3: Screen Templates, Which One Do I Choose?
Part 4: Too much information! Sorting and Filtering Data with Queries

In this post I want to talk about user permissions, also known as Access Control. In most business applications we need to limit what resources users can access in the system, usually because of different job function or role. For instance, only system administrators can add new users to the system. Certain data in the application may be sensitive and should be restricted unless that user has rights to that data. LightSwitch makes it easy to define user permissions and provides hooks on entities and queries that allow you to check these permissions.

For a video demonstration on how to set up user permissions see: How Do I: Set Up Security to Control User Access to Parts of a Visual Studio LightSwitch Application?

Authentication and Authorization

There are two pieces of information LightSwitch applications need in order to determine which users have rights to what parts of the system. First, the system needs to verify the user accessing the application. This is called Authentication; in other words, "Prove you are who you say you are." There are two supported types of authentication in LightSwitch: Windows and Forms.

Windows authentication means that the application trusts the user based on their Windows credentials. So once a user successfully logs into their Windows desktop, those credentials are automatically passed to the LightSwitch application. Forms authentication means that the application requests a username and password of its own, completely independent of any other credentials. So when you choose to use Forms authentication, a login screen is presented to the user and they must type their username and password every time they want to access the application.

Once a user is authenticated, the application can determine access to parts of the system by reading their user permissions. This is called Authorization; in other words, "Now that I know who you are, here's what you can do in the system."

LightSwitch uses the ASP.NET membership provider model so you can also incorporate your own custom membership provider. For more information see Customizing LightSwitch User Management.

Setting Up User Permissions

It all starts on the Access Control tab of the Project Properties. To open it, double-click on the Properties node under the main project in the Solution Explorer.

Then select the Access Control tab to specify the type of authentication you want to employ as well as what user permissions you want to define.

By default, the application doesn't have authentication enabled, so here is where you select the type of authentication you want to use.

Using Forms authentication means you will be storing usernames and encrypted passwords inside the LightSwitch database. This type of authentication is appropriate for internet-based applications where users are not on the same network and you need to support operating systems other than Windows. If you are deploying your application for mobile users or to an ISP or an Azure website or cloud service, then you'll want to choose Forms authentication.

Using Windows authentication is appropriate if all your users are on the same network/domain or workgroup, like in the case of an internal line-of-business application. This means that no passwords are stored by your LightSwitch application. Instead the Windows logon credentials are used and passed automatically to the application. In this case you can also choose whether you want to set up specific users and roles or whether any authenticated user has access to the application.

The best way to think of the two options for Windows authentication is:

Give special permissions and roles to the Windows users or Active Directory groups that I administer within the application. (This is always on if you have selected Windows authentication.)
ALSO, let any Windows user access the unprotected parts of my application.

Next you define user permissions that you check in code in order to control access to resources (we'll work through an example next). There is always a SecurityAdministration permission defined for you that is used by LightSwitch once you deploy the application. When you deploy, LightSwitch will create a single user with this permission, which gives them access to the screens necessary to define the rest of the users and roles in the system. However, while debugging your application, LightSwitch doesn't make you log in, because that would be tedious to do every time you built and ran (F5) the application. So instead you can use the Granted for debug checkbox to indicate which sets of permissions should be turned on or off in the debug session.

Note: If you have enabled SharePoint in your LightSwitch application, then user management is handled by SharePoint. You can check the current user's SharePoint profile information, including their department and role, by using the properties of the Application.User object. For more information see Using the Person Business Type.

Let's walk through a concrete example by implementing some security in the Address Book (Contact Manager) application we've been building in this series.

Checking User Permissions in the Address Book Application

Let's start by selecting an authentication scheme. For this example, I'll select Use forms authentication so that everyone who has access to the application can search for and edit contacts from any external network or the internet. However, in order to add or delete contacts, users will need special permissions.

So we need to create two new permissions. You can name the permissions whatever you want; you only see the name in code. When the system administrator sets up users and roles later, they will see the Display Name on the screen, so be descriptive there. Add two permissions: CanAddContacts and CanDeleteContacts.

Next, leave Granted for debug unchecked for both of those permissions so that we can test that they are working. When you leave this unchecked, the permission will not be granted, which allows us to easily test combinations of permissions while debugging. Now that we have these permissions set up, we need to check them in code. As I mentioned, LightSwitch provides method hooks so you can write code where you need it for all sorts of custom validation and business rules, including access control.

For more information on writing code in LightSwitch see the Performing Data-Related Tasks by Using Code topic in the library.

For more information on writing custom business rules see: Common Validation Rules in LightSwitch Business Applications

So in order to implement the security, we need to write a couple of lines of code to check these permissions. LightSwitch provides access control methods on entities and queries, and they are executed on the server so that your data is protected no matter what client is hitting your middle tier. When you want to restrict viewing (reading), inserting (adding), editing or deleting entities, open the entity in the Data Designer, drop down the Write Code button and select the appropriate access control method.

For this application, select the Contacts_CanDelete method and this will open the code editor to that method stub. All you need to do is write one line of code (in bold below) to check the CanDeleteContacts permission you set up:

VB:

Namespace LightSwitchApplication 
Public Class ApplicationDataService
Private Sub Contacts_CanDelete(ByRef result As Boolean)
'Add this one line of code to verify the user has permission to delete contacts:
result = Me.Application.User.HasPermission(Permissions.CanDeleteContacts)
End Sub
End Class
End Namespace

C#:

namespace LightSwitchApplication 
{
public partial class ApplicationDataService
{
partial void Contacts_CanDelete(ref bool result)
{
//Add this one line of code to verify the user has permission to delete contacts:
result = this.Application.User.HasPermission(Permissions.CanDeleteContacts);
}
}
}

Now go back to the Write Code button on the designer and select Contacts_CanInsert, then similarly write the following line of code to check the CanAddContacts permission. Your code file should look like this.

VB:

Namespace LightSwitchApplication
    Public Class ApplicationDataService
        Private Sub Contacts_CanDelete(ByRef result As Boolean)
            'Add this one line of code to verify the user has permission to delete contacts:
            result = Me.Application.User.HasPermission(Permissions.CanDeleteContacts)
        End Sub
        Private Sub Contacts_CanInsert(ByRef result As Boolean)
            'Add this one line of code to verify the user has permission to add contacts:
            result = Me.Application.User.HasPermission(Permissions.CanAddContacts)
        End Sub
    End Class
End Namespace

C#:

using System; 
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.LightSwitch;
using Microsoft.LightSwitch.Security.Server;
namespace LightSwitchApplication
{
public partial class ApplicationDataService
{
partial void Contacts_CanDelete(ref bool result)
{
//Add this one line of code to verify the user has permission to delete contacts:
result = this.Application.User.HasPermission(Permissions.CanDeleteContacts);
}
partial void Contacts_CanInsert(ref bool result)
{
//Add this one line of code to verify the user has permission to add contacts:
result = this.Application.User.HasPermission(Permissions.CanAddContacts);
}
}
}

You may be wondering why we are checking these permissions in the entity instead of the screens. Checking permissions in the entity guarantees that no matter what screen the user is working with, the data actions are protected. You need to remember to secure your entities on the server if you need to implement user permissions in your application. However, if you need to hide/disable UI elements on your HTML screens based on user permissions there are a couple options. See:

LightSwitch Tip: A Simple Way to Check User Permissions from the HTML Client
Using LightSwitch ServerApplicationContext and WebAPI to Get User Permissions

Run it!

Now we are ready to test the application, so build and run by hitting F5. Because we didn't grant the CanAddContacts permission for debug, an error message will be displayed if you try to add a contact and save the data.

If you go back to the Access Control tab of your project properties, you can check off combinations of permissions so you can test them at debug time.

Users and Roles Screens

In order to set up users and roles in the system, the application needs to have an administration console and be deployed. When your application is deployed the first time, LightSwitch will ask you for an administrator username and password, which it deploys into the users table with the SecurityAdministration permission granted. That administrator can then enter the rest of the users into the system.

For more information see: How to Assign Users, Roles and Permissions to a LightSwitch HTML Mobile Client

Wrap Up

As you can see, defining and checking user permissions in Visual Studio LightSwitch is a simple but important task. Access control is a very common requirement in professional business applications, and LightSwitch provides an easy-to-use framework for locking down all parts of the application through access control method hooks. Once you deploy your application, the system administrator can start setting up users and roles to control access to the secure parts of your application.

For more information on user permissions and deploying applications, see the Working with User Permissions and Deploying LightSwitch Applications topics on the LightSwitch Developer Center.

In the next post we'll look at customizing the HTML client with some JavaScript and CSS. Until next time!

Enjoy!


Return to section navigation list

Cloud Security, Compliance and Governance

No significant articles so far this week.


Return to section navigation list

Cloud Computing Events

No significant articles so far this week.


Return to section navigation list

Other Cloud Computing Platforms and Services

Jeff Barr (@jeffbarr) reported Amazon EC2's New I2 Instance Type - Available Now! on 12/20/2013:

Late last month I gave you a sneak peek at our newest EC2 instance type, the I2. These instance types are available today, in four sizes, across seven AWS regions.

The I2 instance type was designed to host I/O intensive workloads typically generated by relational databases, NoSQL databases, and transactional systems. The largest I2 instance type can deliver over 365K random reads per second and over 315K random writes per second, both measured with a block size of 4 KB. With four instance sizes, you can start small and scale up as your storage and I/O needs grow.
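(For a rough sense of scale, 365K random reads per second at a 4 KB block size works out to roughly 1.5 GB per second of read throughput, and 315K random writes per second to roughly 1.3 GB per second.)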

This is our second generation High I/O instance type, picking up where the HI1 instance left off. In comparison to the HI1 instance type, members of the I2 family offer faster processors, three additional instance sizes, a doubling of the amount of memory per vCPU, and 56% more SSD-based instance storage.

The Specs
Here are the instance sizes and the associated specs:

The prices shown above are for On-Demand instances in the US East (Northern Virginia) and US West (Oregon) regions; see the EC2 pricing page for full information.

The instances are available in On-Demand and Reserved form in the US East (Northern Virginia), US West (Oregon), US West (Northern California), EU (Ireland), Asia Pacific (Singapore), Asia Pacific (Tokyo), and Asia Pacific (Sydney) regions.

I2 instances support Hardware Virtualization (HVM) AMIs only. In order to obtain the best I/O performance from these instances, you should use the Amazon Linux AMI 2013.09.02 or any Linux AMI with a version 3.8 or newer kernel. Older versions of the kernel will exhibit lower I/O performance when used with I2 instances.

CPU Power
Each vCPU (Virtual CPU) is a hardware hyperthread on an Intel E5-2670 v2 (Ivy Bridge) processor. The processor supports the AVX (Advanced Vector Extensions), along with Turbo Boost and NUMA.

NUMA (Non-Uniform Memory Access) speeds access to main memory by optimizing for workloads where the majority of requests for a particular block of memory come from a single processor. By enabling processor affinity (asking the scheduler to tie a particular thread to one of the processors) and taking care to manage memory allocation according to prescribed rules, substantial performance gains are possible.

Enhanced Networking
All four sizes of the I2 instance type benefit from our new Enhanced Networking feature. When you launch these instances inside of a Virtual Private Cloud (VPC), you will enjoy low latency, low jitter, and the ability to move a very large number of packets per second (PPS). In order to take advantage of this important feature, you will need to use an HVM AMI with the proper drivers installed (read our documentation on Enabling Enhanced Networking to learn more).

The three smallest instance types also support EBS Optimization, with dedicated network throughput from the instance to Amazon EBS.

SSD Storage
As you can see from the table above, the I2 instances include a copious amount of SSD storage, ranging from 800 gigabytes on the i2.xlarge all the way up to 6.4 terabytes on the i2.8xlarge.

The SSD storage now supports TRIM functionality, which improves performance when the SSD handles a series of successive write operations.

Go For Launch
As I mentioned earlier, these instance types are available now in seven AWS regions and you can start to use them right now.


Colin Su posted the slide deck for A Tour of the Google Cloud Platform on 12/20/2013:

Other Google Cloud slide decks by Colin Su:
Introduction to Google App Engine
Introduction to Hadoop and MapReduce

Jeff Barr (@jeffbarr) described AWS Direct Connect - Access to Multiple US Regions on 12/19/2013:

AWS Direct Connect makes it easy to establish a dedicated network connection from your premises to AWS. Our customers use Direct Connect to reduce their network costs, increase throughput, and provide a more consistent network experience than Internet-based connections.

Connect Now
Effective immediately, you can provision a single connection to any Direct Connect location in the United States and use it to access all four of the AWS Regions in the US (US East (Northern Virginia), US West (Northern California), US West (Oregon), and AWS GovCloud (US)). Data transferred between Regions flows over network infrastructure maintained by Amazon and does not flow across the public Internet.

If you have already used Direct Connect to create a dedicated connection, the new routing is already in effect. Our networking infrastructure now announces routes to the connection via the usual BGP announcements.

What You Get
As a Direct Connect user, you will see a number of benefits from this change.

Cost Savings - One connection, to any AWS Region in the US, can potentially take the place of up to four existing connections. You will pay less for your network circuits and for Direct Connect, and the per GB data transfer cost is also lower.

Improved AWS Access - Your on-premises applications can now connect to the public endpoints of AWS services running in any of the AWS Regions in the US.

Enhanced Data Protection - Data transferred between the application and AWS will not flow across the public Internet.

Pricing
This new feature is included in the cost of Direct Connect. You pay only for data transfer from the remote Regions to your Direct Connect Region. This data transfer is billed at the rate of $0.03/GB.
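To put that rate in perspective, transferring 1 TB (1,000 GB) from a remote US Region across the connection would add about $30 in inter-Region data transfer charges at $0.03/GB.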


Return to section navigation list

Technorati Tags: Windows Azure,Windows Azure Platform,Windows Azure Cloud Services,Windows Azure Storage Services,Windows Azure Service Bus,Windows Azure BizTalk Services,WABS,Windows Azure Access Control Services,Windows Azure Virtual Machines,WAVM,Windows Azure Virtual Networks,WAVN,Windows Azure Active Directory,WAAD,Windows Azure SQL Database,WASB,SQL Database Federations,Window Azure Mobile Services,Mobile Services,WAMS,Open Data Protocol,OData,Cloud Computing,Visual Studio LightSwitch,LightSwitch,Amazon Web Services,Google Cloud Services

Older PostsHomeSubscribe to:Posts (Atom)Forbes: Who Are The Top 20 Influencers in Big Data?Haydn Shaughnessy (@haydn1701) ranked me seventh in his Who Are The Top 20 Influencers in Big Data? article of 2/3/2012 for Forbes Magazine. Thanks, Haydn.
TRAACKR: Who Are The Top 50 Data Science Influencers?Gil Press (@GilPress) ranked me #47 when he added me to TRAACKR's A-List of 50 Top "Data Science" Influencers on 7/1/2012. Thanks, Gil.
Blog Archive 2014(4) January(4)Executing a Service Provider Licensing Agreement (...Uptime Report for My Live Windows Azure Web Site: ...Uptime Report for my Live OakLeaf Systems Azure Ta...Microsofts Project Siena Needs a Windows Azure Mo... 2013(100) December(8) November(7) October(9) September(10) August(7) July(7) June(10) May(4) April(11) March(8) February(8) January(11) 2012(191) December(17) November(11) October(10) September(15) August(13) July(18) June(17) May(17) April(21) March(18) February(14) January(20) 2011(311) December(23) November(23) October(31) September(26) August(30) July(29) June(32) May(20) April(16) March(23) February(24) January(34) 2010(300) December(30) November(31) October(20) September(22) August(26) July(26) June(24) May(31) April(27) March(22) February(20) January(21) 2009(191) December(18) November(14) October(24) September(14) August(13) July(14) June(10) May(13) April(12) March(17) February(21) January(21) 2008(197) December(13) November(20) October(18) September(18) August(22) July(23) June(21) May(22) April(7) March(12) February(7) January(14) 2007(187) December(14) November(10) October(15) September(13) August(13) July(14) June(15) May(19) April(18) March(31) February(16) January(9) 2006(73) December(14) November(5) October(2) September(3) August(7) July(3) June(8) May(12) April(6) March(8) February(3) January(2) 2005(59) December(5) November(11) October(11) September(17) August(1) July(3) June(2) May(3) April(1) March(5)OakLeaf Blog Curations on Curah!Windows Azure and Cloud Computing Article CompendiumAndroid MiniPCs and TVBoxes Are the Price Leaders in the BYOD MarketLinks to SQL Azure Labs and Other Big Data ArticlesMy Recent Articles about SQL Azure Labs and Other Added-Value Windows Azure SaaS Previews: A Bibliography post of 11/1/2012 contains brief descriptions and links to OakLeaf Systems' articles about Microsoft Codename "Cloud Numerics," "Social Analytics," "Data Hub," "Data Transfer," "Data Explorer," "Project Austin" (StreamInsight Services for Windows Azure), HDInsight, and Windows Azure Mobile Services.
Windows Azure Mobile Services Preview Walkthrough for Windows Store Apps1: Windows 8 ToDo Demo Application2: Authenticating Windows 8 App Users3: Pushing Notifications to Windows 8 Users4: Customizing the Windows Store Apps UI5: Distributing Your App from the Windows StoreOakLeaf's New Windows Azure WebSitesThe contents of the lengthy First Look at the UG007 and Other Android 4.1+ MiniPC Devices post have been split and copied to three shorter posts in a new Android MiniPCs and TVBoxes WordPress blog running as a shared Windows Azure WebSite.

A WordPress clone of the OakLeaf Systems blog running on a free Windows Azure Web Site (formerly codenamed "Antares") demonstrates this new feature from 2012's "Spring Wave" of upgrades and updates to Windows Azure and its components.
Check Out OakLeaf's New Azure DataMarket Offerings
The new Accessing US Air Carrier Flight Delay DataSets on Windows Azure Marketplace DataMarket and DataHub post describes how to subscribe to these two free SQL Azure dataset offerings, filter them with the Explorer's Query Builder, and use LINQPad to execute more complex OData URL queries.
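To give a concrete idea of what such a query looks like, here is an illustrative OData URL that combines the $filter, $orderby and $top system query options. The service path after api.datamarket.azure.com and the entity and column names below are hypothetical placeholders, not the Flight Delay offering's actual schema:

https://api.datamarket.azure.com/<Publisher>/<Offering>/v1/FlightDelays?$filter=DepDelayMinutes gt 60 and Origin eq 'OAK'&$orderby=FlightDate desc&$top=25

A URL of this shape can be pasted into a browser to return the matching entities as an Atom feed, or the service root can be added as a WCF Data Services connection in LINQPad so the same filter can be written as a LINQ query.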
SearchCloudComputing Articles
Click the above image and scroll down to read my cloud computing, Windows Azure and big data articles on TechTarget's SearchCloudComputing.com blog. See also my Links to My Cloud Computing Tips at TechTarget's SearchCloudComputing and Other Search Sites post for a complete list of links.
Articles for Red Gate Software's ACloudyPlace
Click the above image to read a list of my Windows Azure, SQL Azure, cloud computing and big data articles for this new blog. See also my Links to My Cloud Computing Articles at Red Gate Software's ACloudyPlace Blog post for a complete list of links.
Windows Azure Articles for Developer.com
I'm an occasional contributor to the Cloud Computing section of Internet.com's Developer.com web site. Links to my articles are here.
DZone Syndication
DZone syndicates selected posts from the OakLeaf Systems blog in its CloudZone section. Click the badge above for links to syndicated articles. CloudZone is sponsored by Microsoft.
Feeds

Atom 1.0 site feed

RSS 2.0 site feed

Follow me on:

Twitter: rogerjenn

Linked In: Roger Jennings

Facebook: Roger Jennings

Google+

Friendfeed: RogerJennings

Pinterest: rogerjenn

FUSELABS so.cl

My SharePoint Online Site and Blog

My Office 365 Preview Developer Site

My SharePoint 2010 (Office 365) Site

My Microsoft FUSELABS Montage Page

My Amazon Author Page

My GigaOM Pro Analyst Page

OakLeaf Systems' Listings in Microsoft Pinpoint
OakLeaf Public Data Hub: US Air Carrier Flight Delay DataSet is a new Pinpoint listing as of 5/22/2012.

OakLeaf Systems' other listings in the Microsoft Pinpoint registry include links to our live OakLeaf Systems Azure Table Services Sample Project, SQL Azure Reporting Services Preview Sample, Codename "Social Analytics" WinForms Sample and Codename "Data Explorer" mashup.


Links to Cover Stories for Visual Studio Magazine
Links to Eight Years of My Visual Studio Magazine Cover Stories (November 2003 to June 2011)
The Windows Azure Team Interview, 11/30/2010
Robert Duffner posted Thought Leaders in the Cloud: Talking with Roger Jennings, Windows Azure Pioneer and Owner of OakLeaf Systems to the Windows Azure Team blog on 11/30/2010.
OakLeaf Blog Ranked #134 of Influential Data Blogs by eCairn
See Case study with Data blogs, from 300 to 1000.

About eCairn:

eCairn Inc. is a privately held software technology company, founded in October 2006. We specialize in community and influencers marketing and differentiate in who matters.
OakLeaf Systems' Windows Azure Table Services Sample Project
Test drive GridView paging and iterative operations on Northwind Customer entities in Windows Azure Table Storage with the OakLeaf Systems Azure Table Services Sample Project demo from my Cloud Computing with the Windows Azure Platform book.

The dual Web role application has been running in Microsoft's South Central US (San Antonio) data center since September 2009. I believe it is the oldest continuously running Windows Azure application.
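For readers who haven't worked with Azure Table Storage paging, the following minimal C# sketch shows the general pattern the demo relies on: segmented queries that return one GridView page of entities at a time, with a continuation token marking where the next page starts. It is written against the later Azure Storage client library 2.x, and the Customers table name and 12-row page size are only assumptions for illustration, not the sample project's actual code.

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

class TablePagingSketch
{
    static void Main()
    {
        // Local storage emulator; substitute a real storage account connection string as needed.
        var account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
        var table = account.CreateCloudTableClient().GetTableReference("Customers");

        // Request 12 entities per segment -- one GridView page (hypothetical page size).
        var query = new TableQuery<DynamicTableEntity>().Take(12);

        TableContinuationToken token = null;
        do
        {
            var segment = table.ExecuteQuerySegmented(query, token);
            token = segment.ContinuationToken;   // null once the last page has been returned
            // segment.Results holds up to 12 entities -- bind them to the current GridView page here.
        } while (token != null);
    }
}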
Check Out the OakLeaf SharePoint Online Site
The new OakLeaf Systems SharePoint Online Web site demonstrates SharePoint 2010 features delivered by Office 365's SharePoint Online. My first test of the beta version was delivering Access Services as the back end for a Microsoft Access 2010 Web database in a 5/23/2011 Webcast: Moving Access Tables to SharePoint 2010 or SharePoint Online Lists.
Google Blog Search

List of 2,000+ original (topical) OakLeaf posts since 1/1/2009 in reverse date order

Search for 'Windows Azure'

Search for 'SQL Azure'

Search for 'OData'

Search for 'LightSwitch'

Windows Azure Questions Answers

Azure + Windows-Azure Questions

Windows-Azure-Storage Questions

SQL-Azure Questions

Windows Azure Forum

SQL Azure - Getting Started Forum

Windows Azure AppFabric Forum

Windows Azure Webcasts on Channel9

Windows Azure Management Portal

SQL Azure Labs: Incubator SaaS Projects

Windows Azure Platform AppFabric Labs

Windows Azure Apps on Microsoft Pinpoint

About Me
Roger Jennings (--rj), Oakland, California, United States
I'm a Windows Azure Insider, a retired Windows Azure MVP, the principal developer for OakLeaf Systems and the author of 30+ books on Microsoft software. The books have more than 1.25 million English copies in print and have been translated into 20+ languages.

My latest books are Cloud Computing with the Windows Azure Platform for Wrox and Microsoft Access 2010 In Depth for Que.

I'm also a contributing editor for Visual Studio Magazine, write for Redmond Developer News, and contribute to TechTarget's SearchCloudComputing.com blog.

Full disclosure: I make part of my livelihood by writing about Microsoft products in books and for magazines. I regularly receive free evaluation software from Microsoft and press credentials for Microsoft TechEd and PDC. I'm also a member of the Microsoft Partner Network.
My Latest Books
Cloud Computing with the Windows Azure Platform published on 9/21/2009. Click image for Wrox's book site:



I've updated all 1,500 pages of Special Edition Using Microsoft Office Access 2007 to Microsoft Office Access 2010 In Depth for QUE Publishing. This 12th edition, with a new series title, was published on 12/18/2010 and became available for purchase on 1/1/2011.





Early MiniDV and FireWire Posts
DV vs. Betacam SP: 4:1:1 vs. 4:2:2, Artifacts and Other Controversies
Fire on the Wire: The IEEE 1394 High Performance Serial Bus
Labels: a topic-label index with per-label post counts, led by Cloud Computing (878), Azure (807), Windows Azure (776), SQL Azure (525), ADO.NET Data Services (470) and Astoria (452), plus several hundred other labels.
Berkeley Juneteenth Festival 1998 Historical Web Pages
Original Web Site Home Page
History of the Juneteenth Celebration
Juneteenth Bibliography
The Southeast Texas-East Bay Music Connection
Berkeley Juneteenth Festival Silver Anniversary
The Berkeley Juneteenth Festival celebrated its Silver Anniversary on June 24, 2012. From the Berkeleyside blog:

Story: Juneteenth festival celebrates silver anniversary Sunday

Photos: Juneteenth's 25th anniversary a great day out in Berkeley

Berkeley Juneteenth Festival Web Site


Miscellaneous Links
Original (topical) posts search in reverse date order
OakLeaf site information from Alexa.com
MSFT Award for my U.S. Code of Federal Regulations Web Service
Remembrance of Things Past: Fluidyne Instrumentation, Urethane Foam, Artificial Kidneys and Flowmeters
CIPS (Canadian Information Processing Society) Connections Interview re Access 2003
"Client-Server Application Development Using Microsoft Access 2.0" TechEd 95 Presentation
"An (Almost) Lifetime of BASIC" article for Apress's 2001 "VB @ 10 Years" Project
"Access Heroes" article for the Tenth Anniversary of Microsoft Access
Squidoo Lenses
The Black Scholar - Journal of Black Studies and Research
No More U.S. Custom Surfboards? - The effects of Clark Foam's closing
Terms of Use/Privacy Statement
OakLeaf Systems Website Privacy Statement
OakLeaf Systems Website Terms of Service
Copyright



Original content copyright Roger Jennings, 2005-2013, and licensed under a Creative Commons Attribution 3.0 License.

Content of others quoted in this post is subject to the copyright terms of the originator.
