Introducing backup for Azure file shares

Today, we are excited to announce the public preview of backup for Azure file shares. Azure Files is a cloud-first file share solution with support for the industry-standard SMB protocol. With this preview, Azure Backup offers a native backup solution for Azure file shares, a key addition to the feature set enabling enterprise adoption of Azure Files. Using Azure Backup, via a Recovery Services vault, to protect your file shares is a straightforward way to secure your files and be assured that you can go back in time instantly.

Backup for Azure File Shares

Key features

  • Discover unprotected file shares: Utilize the Recovery Services vault to discover all unprotected storage accounts and file shares within them.
  • Back up multiple file shares at a time: You can back up at scale by selecting multiple file shares in a storage account and applying a common policy to them.
  • Schedule and forget: Apply a Backup policy to automatically schedule backups for your file shares. You can schedule backups at a time of your choice and specify the desired retention period. Azure Backup takes care of pruning these backups once they expire.
  • Instant restore: Since Azure Backup utilizes file share snapshots, you can restore just the files you need instantly even from large file shares.
  • Browse individual files/folders: Azure Backup lets you browse the restore points of your file shares directly in the Azure portal so that you can pick and restore only the necessary files and folders.
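
As an illustration of the "schedule and forget" retention behavior described above, here is a minimal Python sketch of how expired recovery points might be pruned once they pass their retention period (hypothetical names and policy values; this is not the Azure Backup API):

```python
from datetime import date, timedelta

def prune_expired(recovery_points, retention_days, today):
    """Keep only the recovery points still inside the retention window.

    recovery_points: list of dates on which snapshots were taken.
    retention_days:  how long each snapshot is retained by the policy.
    """
    cutoff = today - timedelta(days=retention_days)
    return [rp for rp in recovery_points if rp >= cutoff]

# Daily snapshots over the last 45 days, under a 30-day retention policy.
today = date(2018, 2, 1)
points = [today - timedelta(days=n) for n in range(45)]
kept = prune_expired(points, retention_days=30, today=today)
print(len(kept))  # 31: today's snapshot plus the previous 30 days
```

In the actual service this bookkeeping happens server-side; the point is simply that retention is enforced automatically once the policy is set.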

Core benefits

  • Zero infrastructure solution: Azure Backup creates and manages the infrastructure required for protecting your file shares. No agents or virtual machines (VMs) need to be deployed to enable the solution.
  • Comprehensive backup solution: Azure Backup helps you manage the backup of Azure Files as well as Azure IaaS VMs, SQL Server running in IaaS VMs (preview), and on-premises servers. Backup and restore jobs across all workloads can be monitored from a single dashboard.
  • Directly recover files from the Azure portal: Apart from providing the ability to restore entire file shares, Azure Backup also lets you browse a recovery point directly in the portal. You can browse all the files and folders in a recovery point and choose to restore necessary items.
  • Cost effective: Backup for Azure Files is free* of charge during the preview and you can start enjoying all its benefits immediately.
  • Coming soon: Azure Backup has lined up an amazing list of features for backing up Azure file shares, and you can expect an update from us as early as next month. Stay tuned!

*Azure file share snapshots will be charged once the snapshot capability is generally available.

Get started

Start protecting your file shares by using the Recovery Services vaults in your region. The new backup goal options in the vault overview let you choose Azure file shares to back up from storage accounts in the same region.

Backup Goal


Introducing Box GxP Validation: Compliance for an Agile Cloud

Organizations in the life sciences, including pharmaceuticals, biotech and medical device companies, are rapidly transforming. Outsourcing is increasingly commonplace, and a continuous rise in M&A and joint ventures is fueled by a desire to rationalize portfolios and speed up innovation to bring therapies to market. While effective collaboration and speed to market remain critical for market leadership, firms can face regulatory fines, legal exposure and losses in revenue without effective risk management of intellectual property and proper information governance on regulated systems.

Due to this complexity, life sciences organizations have not been able to move their content to the cloud as aggressively as they would like. Many maintain legacy on-premises ECM systems in order to meet stringent guidelines issued by the FDA and other regulatory bodies on how information is stored, managed, and distributed when it pertains to drug development, medical device development, clinical trials, and patients. These systems, inflexible to user needs and time-consuming to maintain and validate, create inefficient silos between regulated and unregulated information and slow collaboration and innovation.

Manage regulated and non-regulated content on a single cloud

At Box, our mission is to help organizations transform the way they work, even in the face of all this complexity. Today, we are excited to announce Box GxP Validation, an innovative approach for maintaining GxP compliance in the cloud which enables organizations to leverage a single secure and compliant Cloud Content Management platform to manage both unregulated and regulated content. Life sciences companies can now create, collaborate, manage, distribute and archive regulated content associated with clinical development and manufacturing processes.

Box GxP Validation provides customers a Validation Accelerator Pack (VAP) and daily testing to qualify and maintain compliance of their Box instance. Box's innovative validation methodology accelerates initial validation and lowers risk via use of daily automated tests to assure a continued state of compliance for the Box platform. With access to regulated and non-regulated content in a single repository that is highly secure, compliant and accessible anywhere, customers can find the information they need, whenever they need it, in a cost-effective manner.


As part of Box GxP Validation, Box is partnering with USDM, a leading risk management, technological innovation and business process optimization firm for life sciences, to provide Cloud Assurance, a service that validates changes in the production environment of Box. USDM can also work with customers to ensure their own configurations and customizations are unaffected by these changes.

One Cloud Platform for a Full Range of Use Cases

Box GxP Validation creates new opportunities for customers to create, manage, collaborate and distribute regulated content in Box as they work cross-functionally within their organization or with external business partners that are critical to the clinical development and drug manufacturing processes. We're excited to enable new life sciences use cases with Box, including:

  • Exchange of clinical content between sponsors, CROs, CMOs, and investigator sites
  • Collaboration and exchange of regulated content during joint development or M&A activities
  • Secure archival of SOPs and clinical study documentation
  • Compliance with 21 CFR Part 11 requirements by integrating Box with eSignature providers
  • Use of Box as a compliant content layer with life sciences ISV and SI partners

Today, 800+ life sciences organizations are Box customers, including Eli Lilly, Boston Scientific, GlaxoSmithKline, AstraZeneca, Edwards Lifesciences, Daiichi-Sankyo and Shionogi, leveraging Box to securely and compliantly manage day-to-day operations, research, sales, marketing and global product launches. With the announcement of Box GxP Validation, we will be able to transform the way these organizations and many others work, with both regulated and unregulated content.

To hear more about this new game-changing model for maintaining GxP compliance in the cloud, register for our upcoming webinar, Meet Box GxP Validation: Regulated and unregulated content, now in the same cloud. In the meantime, read up on the Box GxP Validation methodology. 

Enabling new compliance capabilities in life sciences is just one example of how Box recognizes the importance of compliance strength in the cloud. Look out for more compliance capabilities from Box, coming soon.


All the passwords in the world will not protect our data now

The only way to fully fix the microchip flaw would be a mass recall of almost every device on the planet

For years, computer experts have given us a baffling list of things we have to do to ensure our most precious data is kept safe: Use passwords with numbers and capital letters; no wait, use longer ones, and don’t forget to come up with different codes for each account. We’re now encouraged to use techniques such as two-factor authentication and fingerprint scanners to ward off hackers, and the companies we trust with our photos and messages go to even greater lengths, employing military-grade security and secrecy around the huge data centres where they are stored.

Last week, we learnt that all this is not enough. A critical flaw in billions of microchips that power everything from our mobile phones to corporate supercomputers has emerged, allowing hackers to gain access to private files when we do something as innocuous as visiting the wrong website. At first, it was believed that the bug covered merely computers using chips designed by American giant Intel and would be solved by a technical update; but it has since emerged that the problem is far more widespread, extending to the devices in our pockets, and not fixed as easily as hoped.

The hack, which takes advantage of the way microchips process instructions from computer programmes, affects computers dating back two decades. Even products made by Apple, a company known for priding itself on security, are vulnerable.

The fact that there are no known examples of cybercriminals exploiting the “Meltdown” and “Spectre” bugs is little consolation. As soon as the existence of the flaws emerged, hackers will have begun working on a way to exploit them. It is possible that such weapons have been in the works for months, since the bugs were first uncovered in June last year.

Although they have been kept under wraps by tight-knit security teams since then, selling this knowledge to the right bidder — a rogue state, for example — could be worth millions.

The last 12 months have seen major cyber attacks mounted with growing frequency, from the “WannaCry” outbreak launched from Pyongyang that crippled parts of the National Health Service in Britain last March, to the hack of Uber’s servers that stole the data of 57 million people. It may be only a matter of time until we see this flaw exploited.

The tech companies we entrust with increasing amounts of data are rushing to introduce software updates. Apple said some of the issues had already been patched, and it would come out with further updates in future.

But, and this is the crucial point, these solutions are mere sticking plasters. The startling, and unprecedented, thing about this bug is that there is almost no way to defend against it. By going to the core of our computers, it renders the defences we have come to trust (passwords, anti-virus programmes, encryption) useless. Follow all expert advice and you are still vulnerable. The only way to fully fix it would be a mass recall of almost every device on the planet.

The microchip companies that sacrificed security for speed are rightly suffering from this outbreak. But the lesson to take is this: The things we store on our computers, that we photograph on our phones or that we send to others, are all potentially compromised. No matter how much we keep up our guard, a digital paper trail exists somewhere, and it can be located with the right tools. It is a trade-off we may be willing to accept for all the benefits that digital technology brings us, but this latest security scare is a wake-up call. It should reset our understanding of what is safe, and more crucially, what is not.



Microsoft introduces a free new tool to get another edge in the cloud war with Amazon

  • Microsoft Azure Migrate is a free tool, launching Nov. 27th, that will make it easy for VMware customers to bring their applications to the cloud.

  • It'll help boost Microsoft's big advantage versus Amazon in the cloud war: Microsoft's wealth of experience selling to even the largest businesses.


On Tuesday, Microsoft is expected to unveil Azure Migrate — a free tool to make it easy for customers to bring their existing applications and data from their own servers up into the Microsoft Azure supercomputing cloud. 

At its November 27th launch, Azure Migrate will support shunting up VMware-based applications. VMware is very popular in the Fortune 500 and beyond, giving Azure Migrate a broad audience right off the bat. And it fits well with Microsoft's big cloud edge, which is that it has so much experience selling into even the largest businesses.

The Microsoft Azure cloud, like its key rival at Amazon Web Services, gives customers access to fundamentally unlimited supercomputing power on a pay-as-you-go basis. Large and small companies alike are drawn to these so-called "public clouds" for the cost-savings and performance improvements they often bring.

There are other cloud migration tools on the market, including from startups like Racemi and Cloudreach. However, Microsoft's Corey Sanders, head of product for Azure Compute, tells Business Insider that where other solutions bring over one server at a time, Azure Migrate brings over a whole bunch at once.

That's clutch, says Sanders, because modern applications are a hodgepodge of different servers, running different pieces of the application all at once.

"There are no applications that consist of a single server," says Sanders. 

Once the data is actually moved over, Sanders says that Microsoft provides tools for "rightsizing," or precisely managing your usage of the Microsoft Azure cloud. With proper rightsizing, says Sanders, some customers are saving 84% versus the costs of keeping their VMware infrastructure running in their data center.
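
For context, the 84% figure is a simple cost ratio between the two environments. A quick sketch (the dollar amounts below are illustrative, not Microsoft's):

```python
def savings_percent(on_prem_monthly_cost, cloud_monthly_cost):
    """Percentage saved by moving a workload off on-premises infrastructure."""
    saved = on_prem_monthly_cost - cloud_monthly_cost
    return round(100 * saved / on_prem_monthly_cost, 1)

# Hypothetical workload: $10,000/month on VMware infrastructure,
# rightsized to $1,600/month in Azure, matching the 84% savings cited.
print(savings_percent(10_000, 1_600))  # 84.0
```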

And if a customer isn't ready to fully use the Azure cloud, Microsoft is also unveiling a service where, essentially, Microsoft will run their VMware infrastructure in its own data centers. It's not formally a part of the Azure cloud, meaning you lose out on some of the scalability. But it means the customer doesn't have to keep data centers.

For Microsoft, the strategic imperative behind Azure Migrate is multifaceted. First, there's the obvious: Making it easy for companies to go up to the Azure cloud could result in more companies using the Azure cloud. Notably, VMware itself wasn't consulted to build this tool, says Sanders, meaning this isn't an official partnership.

The other thing, though, is that Microsoft is a big fan of the so-called "hybrid cloud," whereby companies keep some of their software and data in Azure, and some of their software and data in their own servers. Sanders points out that Azure Migrate lets companies bring over some, but not necessarily all, of their software — meaning that, down the line and at their own pace, they could adopt that hybrid cloud model.

As for the future, Sanders says that Azure Migrate won't necessarily stay a VMware-specific tool, and that the company is listening to customer demand. The overall goal, says Sanders, is to make sure that Azure stays as business-friendly as possible.

"Azure is the choice for enterprises," says Sanders.



CET Event (Customer Engagement Technology) | 2017

Rounak Computers LLC, in partnership with Crayon Middle East, participated in the 12th CET Event (Customer Engagement Technology), held in Dubai on November 22nd, 2017, on behalf of Microsoft.


The CET Event is held every year and brings companies' CEOs and CFOs together with the latest technology service providers available in the region.


Rounak Computers LLC walked participants through the various products available in Microsoft Cloud Solutions, such as O365, Azure and Dynamics 365.





Gear up for more cyber-attacks in 2018

Cryptocurrencies allow cybercriminals to obfuscate ‘clean’ funds with dirty money

Ransomware continued to dominate the cybersecurity landscape in 2017 and will continue to pose a major threat to enterprises and individuals around the globe next year, as the method continues to prove profitable and offers virtual untraceability for cybercriminals, industry experts said.

In 2017, 26.2 per cent of ransomware targets were business users — up from 22.6 per cent in 2016. This increase is due in large part to three major sophisticated attacks — WannaCry in May, ExPetr in June and BadRabbit in October.

Mahmoud Mounir, regional director at Secureworks, said that ransomware provides a 1:1 relationship with the victim, requiring no overhead for production of web-injects, managing money-mules, or cashout — with cryptocurrencies such as Bitcoin allowing cybercriminals to obfuscate ‘clean’ funds with dirty money through services like tumbling, mixing and coin laundering.

He said that targeted ransomware attacks on enterprises are also likely to be on the rise, as companies have the capital to pay higher ransoms than individuals. Criminals will continue to become more sophisticated, better resourced, and more patient, and will look to target businesses with higher value ransoms.

According to a Kaspersky Lab report, 65 per cent of businesses were hit by ransomware in 2017. There was a marked decline in new families of ransomware: 38 in 2017, down from 62 in 2016, with a corresponding increase in modifications to existing ransomware (over 96,000 new modifications detected in 2017, compared to 54,000 in 2016). The rise in modifications may reflect attempts by attackers to obfuscate their ransomware as security solutions get better at detecting them.
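
The modification figures in that report imply a sharp year-over-year swing in both directions; the arithmetic, sketched in Python:

```python
def pct_change(old, new):
    """Year-over-year percentage change, rounded to one decimal place."""
    return round(100 * (new - old) / old, 1)

# Approximate figures from the Kaspersky Lab report cited above.
print(pct_change(54_000, 96_000))  # new modifications: 77.8 (% increase)
print(pct_change(62, 38))          # new families: -38.7 (% decline)
```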

Kaspersky Lab predicts a rise in cryptocurrency mining or targeted attacks for the purpose of installing miners, which can result in more money for criminals over time.

Mounir believes targeted attacks on banks will likely remain a threat, especially as organised criminal organisations engage in online banking fraud as one means of generating income.

“Some organisations will focus on non-European and US banks, which are perceived to have weaker security controls and less robust business processes than most of the major Western banks,” he said.

However, he added that malware targeting is diverse and not limited to major banks. Wealth management companies and their high-net-worth customers will also be targeted, as will payroll processing portals.

Alastair Paterson, CEO and co-founder of Digital Shadows, said that the cybercriminal community is all about profit, which means criminals will continue to use the same sorts of tactics as long as those tactics deliver the results they are after: mainly money.

“But whatever happens in 2018 and beyond, what is clear is that cybercrime will continue to be a problem and present governments, businesses and individuals with challenges to protect their data and their intellectual property,” he said.

It is therefore critical that users take steps to manage their digital footprint and the digital risk their business activities on the internet and in cloud solutions present to the world. That way, he said, when something bad does happen, users will know quickly and can deal with it more effectively.

“I expect malware modified with self-replicating capabilities to continue in 2018, particularly given the disruption caused by WannaCry and NotPetya inspiring similar attacks,” he said.

The bar for cyber-attacks keeps getting lower, he said, adding that the availability of leaked tools from the NSA and HackingTeam, coupled with ‘how to’ manuals, means that threat actors will have access to powerful tools that they can iterate on and leverage to aggressively accomplish their goals.

Predictions for 2018

• Business email compromise (BEC) and Business email spoofing (BES) attacks will also continue in 2018. This is where threat actors profit from sending emails to employees who have access to company funds, and from compromising the computer, email account, or email server of the victim organisation in order to intercept and alter, or initiate business transactions.

• Targeted attacks on banks will likely remain a threat, especially as organised criminal organisations engage in online banking fraud as one of means of generating income. Some organisations will focus on non-European and US banks.

• The reliance on AI/machine learning in cybersecurity will continue to grow as more cybersecurity professionals and companies understand the benefits of AI/machine learning in streamlining and enhancing threat detection and response, especially when coupled with human threat analysis.

• Internet of Things (IoT) vulnerabilities will also be increasingly targeted by criminals, especially as the IoT network is fast expanding its user base with the likes of smart home assistants, smart cars, and all smart ‘things’.

• The shortage of skilled cybersecurity workers will continue.

• Cloud security will become a greater priority for businesses, as more companies move their data to the cloud. So, there will be an increased need for cloud security consulting, especially in light of the upcoming GDPR regulation.

• The imminent arrival of the General Data Protection Regulation (GDPR) and its subsequent effects will be largely felt across the industry, with those organisations not protecting data and staying compliant with security regulations exposed and fined up to €10 million or two per cent of worldwide annual turnover.

Amazon Web Services and Azure win higher federal security ratings to deal with national-security data

The public-cloud services offered by both Amazon and Microsoft have received new, higher levels of federal authorization to deal with sensitive data.

Microsoft’s Azure Government got a “provisional authorization” for DoD Impact Level 5 from the Defense Information Systems Agency (DISA), Microsoft said in a blog post today. The authorization will let Defense Department-affiliated organizations plan, assess, and authorize workloads involving unclassified national-security data.

The federal government has six levels of security for cloud data. Level 5 is second from the highest. Level 6 involves information classified as Secret.

Compliance with impact levels is supervised by DISA, which provides IT and communication support to the top members of the executive branch and to the military.

Microsoft has made a special effort to adapt Azure to the federal government. It operates two logically and geographically distinct Azure Government regions (pairs or groups of data centers) exclusively for use by federal, state, or local governments. It also counts the U.S. government among its biggest software customers, just last month signing a $927 million contract to provide technical support to DISA.

“Government wants to embrace the cloud, and we’re leading the way with that,” said Jason Zander, Azure’s corporate VP, in an interview. “We believe we have the most complete solution, with Azure, Office 365 and Dynamics 365 specifically designed for government.” Office 365 is also certified at Level 5, and Dynamics 365 Level 5 certification is “in progress,” he said.

About 7,000 agencies at the federal, state and local level use one or more of those three government-cloud offerings, Zander said.

For its part, Amazon Web Services’ CloudWatch Logs — a service to monitor, store, and access log files from Amazon Elastic Compute Cloud (EC2) instances and other sources — has received provisional authority to operate at the FedRAMP High baseline within the AWS specially dedicated GovCloud (U.S.) region. This authority lets government customers use CloudWatch Logs to process the government’s most sensitive unclassified data.

GovCloud (US) holds provisional authorizations at Impact Levels 2 and 4 but not 5.

Full details on federal cloud security can be found here. A shorter explanation is here.

Amazon Web Services is the most popular provider of computing and storage services over the internet. Azure is number two by most measures.

Feds certify Amazon and Microsoft clouds to handle sensitive government data

Amazon Web Services and Microsoft both said today that parts of their respective cloud offerings have won federal certification as being secure, allowing the government to use them for sensitive patient records, financial data, law-enforcement data and other controlled but unclassified information.

The AWS GovCloud (US), an isolated portion of Amazon’s cloud launched in 2011 and designed to host sensitive workloads, got a provisional authority to operate from the federal Joint Authorization Board under the newly created Federal Risk and Authorization Management Program (FedRAMP) high baseline, AWS said. That baseline is a standardized set of more than 400 security requirements based on controls outlined by the National Institute of Standards and Technology. Data is classified as “high” if its compromise would severely affect an organization’s operations, assets or individuals.

“We’re excited . . . to recognize AWS as having achieved the most rigorous FedRAMP level to date,” said Matthew Goodrich, FedRAMP director, in a prepared statement. Meeting the baseline gives agencies “a simplified path to moving their highly sensitive workloads to AWS,” said Teresa Carlson, vice president of AWS’s worldwide public sector, in the same statement. More than 2,300 government customers worldwide are already using AWS Cloud, and this certification can extend their uses, she said.

The AWS GovCloud (US) Region offers services including Elastic Compute Cloud, Virtual Private Cloud, Simple Storage Service, Identity and Access Management and Elastic Block Store, AWS said. In addition to FedRAMP, it adheres to U.S. International Traffic in Arms Regulations (ITAR) and Criminal Justice Information Services requirements, as well as Levels 2 and 4 for DoD systems.

Microsoft’s Azure Government won the same FedRAMP provisional authority, which Goodrich in a statement called “a testament to Microsoft’s ability to meet the government’s rigorous security requirements.” The company successfully completed a FedRAMP High pilot in March.

Azure has also won provisional authorization to deal with Level 4 DoD data and with ITAR, Microsoft said. Details on its secure cloud are available here.

SAP to Offer Its Business Apps on Google Cloud

Germany’s SAP is teaming up with Silicon Valley giant Google to allow customers to run SAP’s big business applications on Google’s cloud while offering Google’s suite of web-based desktop apps to users, the company said on Wednesday.

Appearing on stage at Google’s Cloud Next conference in California, Bernd Leukert, SAP’s executive board member in charge of products and innovation, is set to announce the two companies are also working on joint machine learning initiatives to be unveiled at SAP’s own user conference in May.

SAP has moved in recent years to encourage the multinational base of corporate customers using its financial planning and other business applications to switch from traditional packaged software running on clients’ own computers to cloud delivery.

SAP, Europe’s largest technology company, said its flagship HANA database software was now running on the Google Cloud Platform (GCP) in order for customers to uncover real-time insights using big data from their operations on a grand scale.

The pact will allow customers to run SAP’s powerful database from laptops and other memory-constrained computers using streamlined HANA express edition software, while off-loading more complex tasks to Google’s cloud delivery platform. SAP also said it was working over the next two months to make its own cloud platform ready to run on the Google cloud, allowing developers to take advantage of its containerisation features that allow technicians to automate software updates.

SAP also plans to offer Google’s G Suite of business productivity apps including Gmail and Google Calendar to its own base of customers of more than 345,000 companies, which includes nearly 90 percent of the world’s 2,000 biggest firms.

This business collaboration reflects a bid by major internet companies such as Google and Apple to move into business software markets where SAP is a powerhouse in enabling companies to operate on the emerging industrial internet.

Separately, SAP agreed last year with Apple to allow its 2.5 million corporate developers to build SAP apps that run on iPhones and iPad tablets. Toward that end, they plan to launch a software development kit for programmers later this month.

Microsoft’s new Linux option for Azure is Clear in the cloud

Microsoft announced today that it has added support for the Intel-backed Clear Linux distribution in instances for its Azure public cloud platform.

It’s the latest in a lengthy string of Linux distributions to become available on the company’s Azure cloud. Microsoft already supports CentOS, CoreOS, Debian, Oracle Linux, Red Hat Enterprise Linux, SUSE Enterprise Linux, OpenSUSE and Ubuntu in Azure instances.

The new distro is available in three versions from Microsoft – first, in a stripped-down, simple VM designed for maximum customizability, second, in a Docker-based container runtime, and, finally, in a “sample solution image” designed for machine learning applications, to demonstrate some of the possibilities.

Clear Linux is a lightweight Linux distribution designed to be as high-performing as possible for server and cloud use – it’s the brainchild of Intel, which is positioning it as a key building block for containerized applications in particular and the cloud in general. It features a sophisticated workload scheduler, optimizations to the kernel and major Linux components like systemd, and stateless operation.

Stateless is a big deal, according to Microsoft open source product manager Jose Miguel Parrella, particularly for teams operating in a DevOps environment.

“By separating the system defaults and distribution best practices from the user configuration, Clear Linux simplifies maintenance and deployment which becomes very important as infrastructure scales,” he said in a statement.

Microsoft’s embrace of Linux as a technology of the future, particularly where the cloud is concerned, has been well-documented. The company, which joined the Linux Foundation in November, says that fully a third of all virtual machines running on Azure are Linux.

