
Sunday, March 23, 2014

Things I need to bring to the Global Windows Azure Bootcamp

There are only a few days left until we meet at the Global Windows Azure Bootcamp, and it looks like we are going to pack the house: we have had to hang up the "no tickets left" sign. Even so, you can still register on the waiting list, since we will run an attendance confirmation process to free up any seats that are not going to be used.

Remember that, besides the great sessions taking place throughout the day, we are going to make history with the largest global server farm, collaborating on a scientific research project for the early diagnosis of Type 2 diabetes. It is going to be something big that will give people a lot to talk about regarding the potential of global computing, and something you will be able to feel proud of having contributed your grain of sand to.

[Image: HF and Diabetes Glycosylation]

Click here to learn more about the Global Windows Azure Bootcamp GlyQ-IQ lab

To participate and collaborate in this research you need to come prepared. Do you have everything ready? Do you remember what you had to bring to the event?

Activate a Windows Azure subscription

Don't worry, you don't need to know anything about molecular biology or be a distributed computing engineer. You only have to bring one thing: an active Windows Azure subscription.

There are several ways to get a Windows Azure subscription, and we encourage you to do it right now to avoid delays activating it on the day of the event:

  • Activate the free 30-day trial: following this link you can activate a free one-month subscription with €150 of credit, more than enough to deploy up to 20 servers during the event without spending a cent out of your own pocket;

Activate a free Windows Azure subscription

  • Activate the benefits of your MSDN subscription: MSDN subscribers get, as a benefit, an Azure subscription with $150 of monthly credit, whose resources can also be used for this event. Hot tip: is the company you work for a Microsoft Partner? Check the benefits of the competencies it has earned and activate the MSDN subscriptions associated with them!

OPTIONAL: bringing a laptop is optional, although recommended. You won't need to bring anything preinstalled, nor will you install anything there. It is simply a matter of deploying a service on Azure through the browser itself, something we will give more details about on the day of the event. The "Ask the Expert" area will be available throughout the day to solve any problem you may have.

We are going to make history. See you on Saturday!

Sunday, March 9, 2014

The Global Windows Azure Bootcamp is coming!

Next March 29, the largest global event on Windows Azure will take place throughout the day in nearly 140 locations around the globe. The idea is to spend a day learning and sharing knowledge about Microsoft's cloud platform while, at the same time, running a research project for the early diagnosis of Type 2 diabetes.

From the technical communities in Spain we wanted to make a very special event, so we are going to concentrate it in Madrid at the Microsoft offices, where all the specialists and MVPs who currently work day to day with the Azure platform will be present.

Holding this event with the leaders of the Windows Azure community in Spain, while keeping attendance free, would not be possible without the dedication and hard work of the speakers and organizers, and the financial contributions of other organizations that help fund its logistics. Many thanks to all of them.

Registration for this event is free -Yes! Free!- and is done through Microsoft World Wide Events. Register at the following link to be in Madrid and attend the Global Windows Azure Bootcamp 2014 in person. Don't wait, seats are limited!

Registration for the Global Windows Azure Bootcamp – Madrid

For more information, visit the following links:

See you there!

Friday, January 31, 2014

Looking for a Surface Pro 2 256GB using the Azure Cloud Power

For a few weeks I have been trying to find a way to buy a Surface Pro 2 256GB to definitively replace my development laptop. Some videos I have seen on YouTube, like driving 4 external displays, caught my attention, and after asking Joe Brinkman and Alberto Diaz about their experience working with the tablet as a dev machine, both answered that it can definitely replace my laptop (with the docking station, of course). The final decision was taken after seeing another thread in the Surface Forums about the gaming experience, like playing Call of Duty Ghosts on a Surface Pro 2. If that beast can run those kinds of games, it can surely replace my current laptop.

And the problem started…

Looking for a Surface Pro 2 in the stores

I have been trying to find a Surface Pro 2 256GB (the model with 8GB of RAM) in almost every online store, and it was really surprising that this model is out of stock in almost all of them. When I finally found it at the UK Microsoft Store, I started to consider the problems of not getting the proper warranty from another country, AC plugs different from Spain's, and so on. The price is not low, so the risk of having a problem without a local store/warranty was not an option.


So the final decision was to buy it in Spain, but same problem: out of stock. I called the Microsoft Store and they didn't know when new units would arrive. Bad thing.

I’m a developer, I don’t like doing things twice

I had been checking the Microsoft Store website twice a day for two weeks, and I was getting tired of it. So I wondered if I could automate the process and receive some kind of alert when the devices were back in stock. And then I found out how to create my new minion.

Looking into how the Store page works, I noticed that a GET WebAPI call is made to show the availability of the product:

[Screenshot: GET WebAPI call]

Even more interesting was the result of that WebAPI call, since it's an XML document with the stock status for that product:

[Screenshot: XML stock status response]
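If you want to poke at that endpoint yourself before wiring up the scheduler, a minimal sketch like the one below (plain Node.js with the request module, using the same URL as the scheduled task shown later) just dumps the status code and the raw XML. The sample shape in the comment is only illustrative; the script further down relies solely on the availableQuantity and inventoryStatus values:

var request = require("request");

// Same Microsoft Store availability endpoint used by the scheduled task below
var url = "http://surface.microsoftstore.com/store/mseea/es_ES/DisplayPage/id.ProductInventoryStatusXmlPage/productID.287012200?_=13";

request(url, function (error, response, body) {
    if (error) { return console.error(error); }
    console.log(response.statusCode);
    console.log(body);
    // The response is roughly shaped like this (values are illustrative):
    // <InventoryStatusResponse>
    //   <availableQuantity>0</availableQuantity>
    //   <inventoryStatus>OUT_OF_STOCK</inventoryStatus>
    // </InventoryStatusResponse>
});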

Idea! Idea! Idea!

If I didn’t want to check the website twice a day, what I would like was just to receive an alert of when the devices were in stock again. Would be nice to receive an e-mail on such event, and would love get an SMS on my mobile!

I always compare Azure and other cloud services to a Lego store. The pieces are there to build solutions, you only need to know about them. So let's put some pieces in place:

  • Azure Mobile Services: to have a scheduled task looking for the device stock at the store. There are other options, but the scheduler provided by Mobile Services is sufficient for this purpose and also free when using one scheduled task;
  • SendGrid: to receive an e-mail notification on stock changes. Again, there are other options, but SendGrid offers a free tier for Azure subscribers that allows this to be done for free as well;
  • Twilio: to receive SMS notifications on stock changes. Again, you can sign up for Twilio for free –no credit card needed– and send up to 1000 SMS messages (Azure subscribers get $10 when adding credit to Twilio later).

Cool, I love free stuff.

Adding some code to a scheduled task

So with the pieces in mind and the API keys from SendGrid and Twilio in my Notepad++, I started by creating an empty Azure Mobile Service through the management console.

[Screenshot: Azure Mobile Service]

The only thing inside the mobile service is a scheduled task configured to run once per hour (my minions must work harder than me!!).

[Screenshot: scheduled task configuration]

Finally, here is the script code for the scheduled task (in Mobile Services, the job script defines a function with the same name as the scheduled job, which the scheduler invokes). Note that I have hidden the API keys, e-mails and phone numbers:

function CheckAvailability() {
    try {
        // Microsoft Store (Spain) product availability endpoint for the Surface Pro 2 256GB
        var url = "http://surface.microsoftstore.com/store/mseea/es_ES/DisplayPage/id.ProductInventoryStatusXmlPage/productID.287012200?_=13";

        var request = require("request");

        request(url, function (error, r, body) {
            if (error) { return console.error(error); }
            if (r.statusCode != 200) { return console.error(r); }

            // Parse the XML response and check the available quantity
            var xml2js = require('xml2js');
            var parser = new xml2js.Parser();
            parser.parseString(body, function (err, result) {
                if (err) { return console.error(err); }
                if (result.InventoryStatusResponse.availableQuantity != "0") {
                    sendNotifications(result.InventoryStatusResponse);
                } else {
                    console.warn(result.InventoryStatusResponse.inventoryStatus);
                }
            });
        });
    }
    catch (e) {
        console.error(e);
    }
}

var SendGrid = require('sendgrid').SendGrid;

// Sends both the SMS and the e-mail notifications
function sendNotifications(inventoryStatusResponse) {
    sendSMS(inventoryStatusResponse);
    sendEMail(inventoryStatusResponse);
}

// Sends an SMS through the Twilio REST API
function sendSMS(inventoryStatusResponse) {
    var httpRequest = require('request');
    var account_sid = "Your_Twilio_Account_SID_here";
    var auth_token = "Your_Twilio_Auth_Token_here";
    var from = "+345550000";
    var to = "+3465550001";
    var message = 'The new Surface Pro 2 256GB is now available at Microsoft Store. Units available: '
        + inventoryStatusResponse.availableQuantity;

    // Create the request body
    var body = "From=" + from + "&To=" + to + "&Body=" + message;

    // Make the HTTP request to Twilio
    httpRequest.post({
        url: "https://" + account_sid + ":" + auth_token +
            "@api.twilio.com/2010-04-01/Accounts/" + account_sid + "/Messages.json",
        headers: { 'content-type': 'application/x-www-form-urlencoded' },
        body: body
    }, function (err, resp, body) {
        if (err) { return console.error(err); }
        console.log(body);
    });
}

// Sends an e-mail notification through SendGrid
function sendEMail(inventoryStatusResponse) {
    console.log('Surface Pro 2 256GB available at Microsoft Store. Units available: '
        + inventoryStatusResponse.availableQuantity + '. Sending notifications...');

    var api_user = 'Your_SendGrid_ApiUser';
    var api_key = 'Your_SendGrid_ApiKey';
    var sendgrid = new SendGrid(api_user, api_key);
    sendgrid.send({
        to: 'foo@mydomain.com',
        from: 'bar@mydomain.com',
        subject: 'Surface Pro 2 256GB available at Microsoft Store!',
        text: 'The new Surface Pro 2 256GB is now available at Microsoft Store. Units available: '
            + inventoryStatusResponse.availableQuantity
    }, function (success, message) {
        // If the email failed to send, log it as an error so we can investigate
        if (!success) {
            console.error(message);
        }
        else {
            console.log('Email notification sent!');
        }
    });
}

Finally, by commenting out the line checking for availableQuantity != "0", I got my initial notifications arriving on my devices:


[Screenshot: e-mail notification]


[Screenshot: SMS notification]
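By the way, an alternative to commenting out the availableQuantity check is to temporarily drop a couple of lines like these inside CheckAvailability(), so the notification path runs regardless of the real stock (both values below are made up):

// Hypothetical smoke test: fake a parsed inventory response and call the
// notification pipeline directly. Remove these lines once the SMS and e-mail arrive.
var fakeInventoryStatus = {
    availableQuantity: "5",     // made-up value
    inventoryStatus: "IN_STOCK" // made-up value
};
sendNotifications(fakeInventoryStatus);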


Conclusion


The implementation shown above can be considered a proof of concept –BTW, I probably should not use the Store WebAPI without Microsoft's confirmation– of the things you can do by integrating the different cloud services available today, and start using them for free.


The world has changed and is now cloud-connected.


Have you started to think cloud?


Regards and Happy Coding!

Tuesday, December 31, 2013

Adding NewRelic support to a DNN Website running as an Azure Cloud Service

In my last blog post announcing the DNN Azure Accelerator 2013 Q4 release, I mentioned that I was going to publish some posts about how to set up external startup tasks in order to include persistent changes on the cloud service instances without the need to rebuild the cloud service package.

I will start today with a step-by-step walkthrough showing how to add NewRelic monitoring to the deployed DNN instance by configuring an external startup task. I will be doing the same for MS Application Insights, as well as sharing other tips and tricks to increase performance on IIS, so just be a little patient these days :)

Sign up for NewRelic Standard for free

One of the advantages of signing up for Azure is that there is a lot of free stuff included with your subscriptions, thanks to agreements with third-party companies like NewRelic. In this case, with each Azure subscription NewRelic provides a Standard subscription for one application, giving you fully functional app performance management (including server and real user monitoring).

In order to activate your free NewRelic Standard subscription, follow these steps:

  1. In the Windows Azure Management portal, click on "New > Store" to open the Store dialog. Once there, scroll down until you find NewRelic, and click on the Next button.

    [Screenshot: NewRelic signup, step 1]
  2. Select from the dropdown list the datacenter location where you want to use the service. You should use the same location where your DNN site will reside –or is currently running on.
    [Screenshot: NewRelic signup, step 2]
  3. Click Next to review the purchase of the free service, and click Finish to start provisioning.
    [Screenshot: NewRelic signup, step 3]
  4. After a few seconds, the service will appear in the "Add-ons" section of the Management Portal. The most interesting links at the bottom show your current API Key and License Key, needed to deploy the agents later, and take you to the NewRelic management portal where you can monitor your site once it's deployed.
    [Screenshot: NewRelic add-on provisioned]

    [Screenshot: NewRelic connection info]
    [Screenshot: NewRelic management portal]

At this point, you have provisioned your free NewRelic Standard subscription. Let's start configuring the DNN cloud service so it starts reporting in the "Applications" section.

Creating the external startup task for the DNN Azure Accelerator

On NewRelic's website you can find how to modify an Azure cloud service package to start monitoring, and in fact you could go that way by downloading the DNN Azure Accelerator source code from CodePlex. But in this case, instead of rebuilding the cloud service package with Visual Studio, we are going to use the new external startup task feature introduced in the latest release.

In summary, the steps to build the NewRelic external startup task are:

  • Create a PowerShell cmdlet that will be executed on role startup
  • Zip the cmdlet together with NewRelic's agent and upload it to a public URL location
  • Specify the URL and license key parameters in the "Startup.ExternalTasks" and "Startup.ExternalTasks.KeyValueSettings" configuration settings.

Let's go through them one by one.

Create the PowerShell cmdlet

NewRelic provides two different types of agents depending on what you are going to monitor: the .NET Agent, which collects information about your .NET application, real user monitoring, etc.; and the Server Agent, which collects information from a virtual machine perspective, such as CPU, memory, running processes, etc.

In this case we will keep the PowerShell cmdlet simple and configure only the .NET Agent, but with some modifications you can deploy the Server Agent as well. Note that for the Server Agent you would probably need more than the free tier if you deploy more than one instance in the cloud service (I'm not 100% sure of this, something to ask NewRelic's sales support).

The following PowerShell script installs NewRelic's .NET Agent into a cloud service running the DNN Azure Accelerator. The license key, the application description and the environment are taken from the new role configuration setting "Startup.ExternalTasks.KeyValueSettings" introduced in the latest build. This value is a collection of key/value pairs, separated by semicolons.

#    New Relic installation script for DNN Azure Accelerator (cloud services) - v2.18.35.0
#    This script installs only the New Relic .NET Agent. The license key, the application description 
#   and environment are taken from the Role configuration setting "Startup.ExternalTasks.KeyValueSettings" 
#   (it's a collection of key=value pairs, separated by semicolon).

$scriptPath = split-path -parent $MyInvocation.MyCommand.Definition;
$newRelicAgentInstallationPath = Join-Path $scriptPath "NewRelicAgent_x64_2.18.35.0.msi"
$logPath = Join-Path $scriptPath "NewRelicInstall.log"

[void] [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.WindowsAzure.ServiceRuntime")

# Writes in the installation log
function Append-Log($text)
{
    $((Get-Date -Format "yyyy-MM-dd HH:mm:ss") + " - " + $text) >> $logPath
    foreach ($i in $input) {
        $((Get-Date -Format "yyyy-MM-dd HH:mm:ss") + " - " + $i) >> $logPath
    }

}

# Retrieves configuration settings from the cloud service configuration
function Get-ConfigurationSetting($key, $defaultValue = "")
{
    if ([Microsoft.WindowsAzure.ServiceRuntime.RoleEnvironment]::IsAvailable)
    {
        Append-Log "The Role Environment is available. Looking the setting: " + $key
        return [Microsoft.WindowsAzure.ServiceRuntime.RoleEnvironment]::GetConfigurationSettingValue($key)
    } else {
        Append-Log "The Role Environment isn't available"
    }
    return $defaultValue
}

# Retrieves the external startup key/value pairs        
function Get-StartupTaskKeyValueParameters()
{
    $setting = Get-ConfigurationSetting -key "Startup.ExternalTasks.KeyValueSettings"
    Append-Log $("Startup.ExternalTasks.KeyValueSettings: " + $setting)
    $kvSettings = @{}

    $matches = [regex]::Matches($setting, "\s*(?<KEY>[^\=\s]+)\=\s*(?<VALUE>[^\;]+)(\;|$)", @("Ignorecase"));
    foreach ($match in $matches) {
        $kvSettings.Add($match.Captures[0].Groups["KEY"].Value.Trim(), $match.Captures[0].Groups["VALUE"].Value.Trim());
    }

    return $kvSettings
}

# Installs a .msi file
function Install-MsiFile($msifile, $arguments)
{
    Start-Process `
         -file  $msifile `
         -arg $arguments `
         -passthru | wait-process
}

Append-Log "Getting startup task parameters..."
$settings = Get-StartupTaskKeyValueParameters
$licenseKey = $settings["NewRelic.LicenseKey"]
if (!$licenseKey) {
    Append-Log "ERROR: license key not specified. The NewRelic installation cannot be performed"
    Break
}
Append-Log $("License key: " + $licenseKey)

# New Relic Agent installation:
Append-Log "Installing New Relic .NET agent..."
Install-MsiFile $newRelicAgentInstallationPath $(" /norestart /quiet NR_LICENSE_KEY=" + $licenseKey + " INSTALLLEVEL=50 /lv* E:\nr_install.log")

# Modify the configuration file (application name and host in case we are in staging)
Append-Log "Changing the .NET agent configuration file..."
$path = Join-Path (get-item env:"ALLUSERSPROFILE").Value "New Relic\.NET Agent\newrelic.config"
[XML]$newRelicConfig = Get-Content $path

# Application name:
$newRelicConfig.configuration.application.name = $(if ($settings["NewRelic.AppDescription"]) { $settings["NewRelic.AppDescription"] } else { "My DNN Website" })
# Log level (info by default). We will set this to warning
$newRelicConfig.configuration.log.level = "warning"

# If we are in staging, we have to set the staging host
if ($settings["NewRelic.Environment"] -eq "Staging") {
    $newRelicConfig.configuration.service.SetAttribute("host", "staging-collector.newrelic.com")
}
$newRelicConfig.Save($path)

# Restart IIS in order to load the new configuration
Append-Log "Restarting IIS..."
IISRESET >> $logPath
NET START W3SVC >> $logPath

Append-Log "Done!"


Zip the .NET agent and the script in one file


Note that in the previous script the .NET Agent .msi file is release 2.18.35.0, which can be downloaded via NuGet. To keep this script up to date, you only need to change the $newRelicAgentInstallationPath variable to match the latest version available. Also note that the script filename must follow the format "Task???.ps1" to tell the DNN Azure Accelerator that it must execute this task.


You can download the full zipped external task from this URL:



Specify the settings in the service configuration file


Once you have uploaded the .zip file to a public location –test the URL from your browser first to verify that it works–, you need to specify the following settings in the service configuration:



  • "Startup.ExternalTasks": the public internet URL from which the .zip file containing the external startup task will be downloaded
  • "Startup.ExternalTasks.KeyValueSettings": parameters for the NewRelic external startup task, in the following format:

    "NewRelic.LicenseKey=<LICENSEKEY>; NewRelic.AppDescription=<APPDESCRIPTION>; NewRelic.Environment=<DEPLOYMENTSLOT>"

    where:
    <LICENSEKEY> is your NewRelic license key, shown in the "Connection info" window of the add-on in the Azure Management portal (see step 4 of the NewRelic provisioning above)
    <APPDESCRIPTION> is a descriptive name for your DNN deployment
    <DEPLOYMENTSLOT> is the deployment slot of your cloud service: Production | Staging

You can specify these values on a running deployment with the latest DNN Azure Accelerator release and all the configuration will be done automatically, since modifying the external startup task settings recycles the running role instances, executing all the startup tasks again –the operation will take a few minutes to finish.


[Screenshot: service configuration settings in the Azure portal]


If you are going to update your deployment from a previous version using a deployment upgrade, you can use your preferred service configuration file from the "/packages" subfolder. Note that you will need to manually replace the "@@" variables in the .cscfg file. You can use the one that was previously uploaded to Azure Storage in the "dnn-packages" container as a guide.


Deploy the service with the Accelerator’s wizard


If you are going to deploy a completely new cloud service, or you are going to update your current deployment by redeploying the service using the Accelerator wizard, you will need to manually specify the NewRelic startup task settings before running the wizard. To do this, open in the "/packages" subfolder the .cscfg file that you will later use for deploying with the wizard. For example, if I want to use the "Medium" package to deploy medium-sized instances, I need to edit the "/packages/DNNAzureSingleAndMedium_2013Q4.cscfg" file –just use notepad.exe to edit these files–, locate the "Startup.ExternalTasks" entries, and fill in the settings with the values specified in the previous step:


[Screenshot: .cscfg file (1)]

[Screenshot: .cscfg file (2)]

Now run the wizard and follow all the steps until your site is running. A few minutes later you will notice that the application starts reporting in the NewRelic management portal:


[Screenshot: the application reporting in NewRelic]


Conclusion


In this article we have seen how easily we can customize cloud service deployments by using external startup tasks. The example shows how to add the NewRelic .NET monitoring agent to a DNN instance running on Azure cloud services, without rebuilding the cloud service package.


Don't miss the next article, where I will post a similar walkthrough for adding Microsoft Application Insights in the same way.


Regards and Happy Coding! And BTW, Happy New Year, friends!!!

Sunday, December 22, 2013

DNN Azure Accelerator 2013 Q4 Released

The new version of the DNN Azure Accelerator, 2013 Q4, is now available for download with new features and fixes. The package is available as usual at CodePlex:

http://dnnazureaccelerator.codeplex.com/

In this version, the most interesting feature is the ability to specify the location of external startup tasks without the need to rebuild the cloud service packages. This functionality is best described in a previous post called "Configuring external startup tasks on an Azure cloud service".

In the coming days I will be posting some examples of startup tasks, like automating the installation of the NewRelic or MS Application Insights agents, modifying IIS performance settings, etc.

Release Notes

New Features

  • Ability to specify the location of external startup tasks without rebuilding the cloud service packages (see above)

Fixes

  • Fixed an issue on the web role startup that caused a cycling error on deployments using the latest Guest OS agent releases (the symbolic link was not being correctly deleted from the temporary storage)
  • Added the "Generate security audits" local security policy for the DNN app pool (needed by WCF services)
  • Changed the default VHD size to 32GB in the wizard
  • Fixed the password verification in the wizard UI to allow passwords longer than 16 characters

Regards & Happy Coding!
