Last night I tried some Louisiana fried chicken and went for a quick walk along the Mississippi River near Woldenberg Park. Having retired early for the evening, I joined fellow competition winner Adam Webster for a pre-conference breakfast at the convention centre. According to the TechEd kitchen, over eighty thousand pieces of bacon have been served so far. As I’ve contributed to that statistic, some serious dieting will need to follow my visit to New Orleans!
‘Microsoft System Center Tips from the Field’ was the first “Birds of a Feather” style session I attended, delivered in question-and-answer, audience-participation form. Gordon McKenna is Director and Chief Architect at a UK-based System Center specialist. He quickly launched into prompting discussion around System Center Operations Manager (SCOM). Removing the infamous 2007 Root Management Server (RMS) role has been a welcome move by Microsoft but has introduced a new problem: there’s a requirement for less than 5ms latency between the Management Servers, and between them and the database. To overcome this, remote sites should use only Gateway servers. The improved Exchange 2013 management pack design and the RMS emulator were also covered.
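To make that latency rule concrete, the gateway-versus-management-server decision can be sketched as a quick check. This is my own illustration, not anything shown in the session: the helper names and sampling approach are invented, though port 5723 is SCOM’s default management channel.

```python
import socket
import statistics
import time

LATENCY_LIMIT_MS = 5.0  # the sub-5ms requirement mentioned in the session


def measure_rtt_ms(host: str, port: int = 5723, attempts: int = 5) -> list:
    """Time TCP connection setup to a management server.

    Port 5723 is SCOM's default agent/management channel. This is a crude
    proxy for network latency, purely for illustration.
    """
    samples = []
    for _ in range(attempts):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2):
            pass
        samples.append((time.perf_counter() - start) * 1000.0)
    return samples


def meets_latency_requirement(samples_ms: list) -> bool:
    """A site qualifies for a full Management Server only if median
    round-trip latency stays under 5 ms; otherwise deploy a Gateway."""
    return statistics.median(samples_ms) < LATENCY_LIMIT_MS


# A remote branch office measuring ~40 ms should get a Gateway server:
print(meets_latency_requirement([38.2, 41.0, 39.5]))  # False -> Gateway
# A well-connected site easily clears the bar:
print(meets_latency_requirement([1.2, 0.9, 1.4]))     # True  -> full Management Server
```

In practice you’d measure over a sustained period rather than a handful of samples, since it’s worst-case latency between sites that bites.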
McKenna admitted he’d not performed an in-place upgrade of Data Protection Manager (DPM) himself, but said it was a hidden gem that organisations should consider. When replacing tape backup with Windows Server 2012 and Azure, it is possible to choose a specific data centre. That should dispel the myth that the data storage location is unknown.
One key question: what is the required manpower to deploy and manage Service Manager? It comes out of the box with predefined definitions and, unlike SCOM, doesn’t require daily maintenance. The Service Manager portal is a SharePoint web part for basic user ticket creation; for any detailed analysis the full client is the only option. Microsoft work closely with Grid Pro, who provide a full-blown web and mobile HTML5-based client.
Several illustrative examples of Service Manager using Orchestrator workflows were shown. One example raised a problem ticket once a certain incident had occurred several times, thus adding problem management to the flow. Opting for a more traditional session, I headed to New Orleans Theater C to hear about Microsoft’s on-premises IaaS direction.
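Before moving on, that incident-to-problem flow is worth sketching. The following toy model is my own invention, not the actual Orchestrator runbook; it just shows the core logic of counting incidents per configuration item and raising a problem ticket once a threshold is hit.

```python
from collections import Counter

PROBLEM_THRESHOLD = 3  # assumed value; the session didn't specify a number


class TicketTracker:
    """Toy model of the demonstrated flow: repeated incidents against the
    same configuration item automatically trigger a problem ticket."""

    def __init__(self):
        self.incident_counts = Counter()
        self.problems = []

    def record_incident(self, config_item):
        """Log an incident; return a problem ticket ID if the threshold
        was just reached, otherwise None."""
        self.incident_counts[config_item] += 1
        if self.incident_counts[config_item] == PROBLEM_THRESHOLD:
            problem_id = "PRB-{:04d}".format(len(self.problems) + 1)
            self.problems.append((problem_id, config_item))
            return problem_id
        return None


tracker = TicketTracker()
tracker.record_incident("mail-server-01")         # below threshold -> None
tracker.record_incident("mail-server-01")         # below threshold -> None
print(tracker.record_incident("mail-server-01"))  # prints PRB-0001
```

The real thing would of course match incidents by classification and time window rather than a bare counter, but the shape of the automation is the same.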
‘Enabling On-Premises IaaS Solutions with the Windows Azure Pack’, presented by Marc Umeno and Eric Winner, was next on my agenda. It began with the explanation that Microsoft’s goal here is a consistent UI, APIs and PowerShell across Azure, on-premises and hosted environments. When an audience question concerning Azure was later raised, it was apparent the company is succeeding: this session was about on-premises IaaS, not Microsoft’s cloud platform, yet the two were easily conflated. Umeno’s demonstration using the Tenant Portal involved adding a new VM server role. The key point was that a role can be published, then taken back offline for editing, quickly and easily.
Remote Console access for tenants requires an RDP client supporting RDPTLSv2, plus System Center 2012 R2. In this context VMM is the main piece of System Center, along with Service Provider Foundation (SPF). Gallery items are delivered as one package containing both .ResdefPkg (viewdef definitions) and .ResextPkg (JSON script and app payload) files.
Eric Winner said that the Windows Server, IIS and StockTrader example Gallery packages will be released initially. Microsoft will also supply authoring and best-practice guides at release but has no plans for an authoring toolset. Feedback so far is positive, with customising JSON scripts the main pain point. During the session I’d checked my Twitter feed to read that Server 2012 R2 was now available in the TechEd Hands-on Labs hall. Woot!
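On that JSON pain point: the customisation cycle boils down to load, edit, re-serialise. Here’s a hypothetical Python sketch — the field names are illustrative only, as I don’t have the actual gallery item schema in front of me.

```python
import json

# Hypothetical fragment of a gallery item's JSON resource script;
# these field names are illustrative, NOT the actual schema.
resource_script = """
{
  "name": "IIS Web Server",
  "parameters": { "vmSize": "Medium", "instanceCount": 1 }
}
"""


def customise(script, **overrides):
    """Load a resource script, apply parameter overrides, and
    re-serialise it -- the kind of hand-editing attendees flagged
    as the main pain point."""
    doc = json.loads(script)
    doc["parameters"].update(overrides)
    return json.dumps(doc, indent=2)


print(customise(resource_script, instanceCount=3, vmSize="Large"))
```

A small wrapper like this beats hand-editing nested JSON in a text editor, which is presumably why an authoring toolset keeps coming up in feedback.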
The ‘Windows Server 2012 R2: Enabling Windows Server Work Folders’ lab was only available on computers in the blue area. After struggling with registration in the silver area, I asked one of the many on-site helpers, who explained that labs were spread across hundreds of designated workstations. The computers are identical, but Microsoft purposely restrict access based on the expertise of the support staff in the vicinity. Quite a clever arrangement, though not an obvious one. I clicked my way through the 20-minute lab but cut it short to dash to the biggest touchscreen I’ve ever seen.
Tim Bakke’s session on ‘Understanding Immersive Productivity and Collaboration Experiences with Perceptive Pixel Devices’ dazzled. What this overly complex title translated to was seeing the biggest and best touchscreen in action. In November 2012 Steve Ballmer said: “We’re all in on devices and services now”, and this screen was that statement’s fruition. Microsoft acquired Perceptive Pixel last year. Its multi-touch hardware and software technology was first demonstrated at the TED conference in 2006, a year before Apple released the multi-touch iPhone. The device recognises the difference between touch and pen while discarding unwanted palm contact.
Bakke demonstrated how Perceptive Pixel works elegantly with Office 2013 while describing the superiority of its projected-capacitance technology (the same technology used in modern smartphones). The screens, available in 55” and 80” sizes, use Gorilla Glass and are currently for sale in the US only. I spoke with Bakke after the session about Australian availability and he’ll need to get back to me. At retail prices of US$7,100 and US$21,000 respectively, they’re a fraction of the six figures asked pre-acquisition.
During the break I enjoyed a mustard-covered fresh pretzel and sought to charge my laptop. The convention centre has plenty of powerpoints which had me wondering where New Orleans’ power comes from. Half of Louisiana’s electricity is generated from natural gas, while coal provides about a quarter of the state’s electricity. The state’s two single-reactor nuclear power plants produce most of the rest. Unfortunately the state’s environmental policies are lacking. A session with reference to George Orwell’s 1984 then caught my eye.