A couple of weeks ago, my colleague Brandi Dollar wrote a blog post about SQL Server transaction log basics. Her post is a great lead-in to a script I wrote to solve a common problem: high VLF counts. An important part of managing your database transaction logs is keeping the number of virtual partitions within the log file, the virtual log files (VLFs for short), low. A high VLF count is typically the result of running with the default autogrow settings. As the transaction log continues to grow in sub-optimal increments, the fragmentation becomes worse and worse, and a high VLF count can contribute to several performance problems.

Fixing a database with a high VLF count is a simple process:
1. Check the current size of the transaction log.
2. Back up the transaction log.
3. Shrink the transaction log to as close to 0 KB as possible.
4. Check that the VLF count is less than 50 (best if less than 10 at this point).
   a. If the VLF count is greater than 50, repeat steps 1 through 3.
   b. You may also have to check that there are no active transactions, etc. The log_reuse_wait_desc column in sys.databases will help identify why the log file cannot be shrunk.
5. Grow the transaction log back out to its original size.
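The two checks in the steps above can be run directly; DBCC LOGINFO returns one row per VLF, so the row count is the VLF count. A minimal sketch, run in the context of the database you are fixing:

```sql
-- The VLF count is simply the number of rows DBCC LOGINFO returns
DBCC LogInfo;

-- If the log file will not shrink, see what is holding it up
SELECT name, log_reuse_wait_desc
FROM sys.databases
WHERE name = DB_NAME();
```

A value of LOG_BACKUP in log_reuse_wait_desc, for example, means another transaction log backup is needed before the shrink will succeed.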
When I log into a new server environment for a client, it is pretty common to see multiple databases with high VLF counts. The steps above are easy enough to complete when you have one or two databases that need to be fixed, but the process becomes very tedious when there are 20, 30, or 40 databases that all need attention.
To save a lot of the legwork involved, I wrote a script to generate all of the necessary commands. When you run the script below, it will print out all of the commands necessary to reduce the VLF count. Take the output from the Messages tab, and you can run it in phases in a new query window. I probably could have taken the script a step further and completely automated the process; however, I still like to have control over the process and monitor its completion. Also, rather than scripting my own transaction log backups, I will typically run an existing maintenance plan job as needed.
The output of the script below will look similar to this:
/****************************************
* Fix High VLF for database: MattWorks
* Starting VLF count: 219
****************************************/
-- Step 1: Get current log file size
use [MattWorks]
SELECT name AS [File Name], physical_name AS [Physical Name], size/128.0 AS [Total Size in MB],
    size/128.0 - CAST(FILEPROPERTY(name, 'SpaceUsed') AS int)/128.0 AS [Available Space In MB], type_desc
FROM sys.database_files;

-- Step 2: Take transaction log backup if needed, and then shrink the log file
DBCC SHRINKFILE (N'MattWorks_log', 0, TRUNCATEONLY);

-- Step 3: Check if current VLF count is less than 50.
-- If not, take another log backup and repeat step #2 until VLF count is less than 50. Check for active transactions, etc.
DBCC LogInfo;

-- Step 4: Grow log file back to original size
USE [master];
GO
ALTER DATABASE [MattWorks] MODIFY FILE (NAME = N'MattWorks_log', SIZE = 1024MB);
GO
I hope that this script can be of use to someone.
/**************************************
* Script to generate commands to reduce transaction logs
* with high VLF counts.
* Written by Matt Nelson, RDX
***************************************/
SET NOCOUNT ON

--variables to hold each iteration
declare @query varchar(100)
declare @dbname sysname
declare @vlfs int

--table variable used to loop over databases
declare @databases table (dbname sysname)
insert into @databases
--only choose online databases
select name from sys.databases where state = 0

--table variable to hold results
declare @vlfcounts table (dbname sysname, vlfcount int)

--table variable to capture DBCC loginfo output
declare @dbccloginfo table
(
    --resourceid int, --need to add this column for SQL 2012
    fileid tinyint,
    file_size bigint,
    start_offset bigint,
    fseqno int,
    [status] tinyint,
    parity tinyint,
    create_lsn numeric(25,0)
)

while exists(select top 1 dbname from @databases)
begin
    set @dbname = (select top 1 dbname from @databases)
    set @query = 'dbcc loginfo (' + '''' + @dbname + ''') WITH NO_INFOMSGS'

    insert into @dbccloginfo
    exec (@query)

    set @vlfs = @@rowcount

    insert @vlfcounts values(@dbname, @vlfs)

    delete from @databases where dbname = @dbname
end

/* Build data file info */
CREATE TABLE #DatabaseFiles
(
    [database_name] [sysname] NOT NULL,
    [file_id] [int] NOT NULL,
    [file_guid] [uniqueidentifier] NULL,
    [type] [tinyint] NOT NULL,
    [type_desc] [nvarchar](60) NULL,
    [data_space_id] [int] NOT NULL,
    [name] [sysname] NOT NULL,
    [physical_name] [nvarchar](260) NOT NULL,
    [state] [tinyint] NULL,
    [state_desc] [nvarchar](60) NULL,
    [size] [int] NOT NULL,
    [max_size] [int] NOT NULL,
    [growth] [int] NOT NULL,
    [is_media_read_only] [bit] NOT NULL,
    [is_read_only] [bit] NOT NULL,
    [is_sparse] [bit] NOT NULL,
    [is_percent_growth] [bit] NOT NULL,
    [is_name_reserved] [bit] NOT NULL,
    [create_lsn] [numeric](25, 0) NULL,
    [drop_lsn] [numeric](25, 0) NULL,
    [read_only_lsn] [numeric](25, 0) NULL,
    [read_write_lsn] [numeric](25, 0) NULL,
    [differential_base_lsn] [numeric](25, 0) NULL,
    [differential_base_guid] [uniqueidentifier] NULL,
    [differential_base_time] [datetime] NULL,
    [redo_start_lsn] [numeric](25, 0) NULL,
    [redo_start_fork_guid] [uniqueidentifier] NULL,
    [redo_target_lsn] [numeric](25, 0) NULL,
    [redo_target_fork_guid] [uniqueidentifier] NULL,
    [backup_lsn] [numeric](25, 0) NULL
)

EXEC dbo.sp_MSforeachdb 'INSERT INTO #DatabaseFiles SELECT ''[?]'' AS database_name, * FROM [?].sys.database_files'

print ''
print '/*********************************************************************************************'
print 'Copy results below this line to new query window to execute as needed.'
print '*********************************************************************************************/'

DECLARE @databasename varchar(max)
DECLARE @vlfcount varchar(max)
DECLARE @filename varchar(max)
DECLARE @physicalname varchar(max)
DECLARE @totalsize varchar(max)
DECLARE @availablespace varchar(max)
DECLARE @typedesc varchar(max)

DECLARE dbcrsr CURSOR FOR
    SELECT dbname, vlfcount
    FROM @vlfcounts
    WHERE vlfcount > 50 --comment this line to print statements for all databases.
    AND dbname not in ('tempdb','master','msdb','model','reportservertempdb')
    ORDER BY dbname
OPEN dbcrsr
FETCH NEXT FROM dbcrsr INTO @databasename, @vlfcount
WHILE @@fetch_status = 0
BEGIN
    DECLARE filecrsr CURSOR FOR
        SELECT name, size/128
        FROM #DatabaseFiles
        WHERE database_name = '[' + @databasename + ']' and type_desc = 'LOG'
    OPEN filecrsr
    FETCH NEXT FROM filecrsr INTO @physicalname, @totalsize

    --Generate print statements to execute in new window
    WHILE @@fetch_status = 0
    BEGIN
        if @totalsize = 0 SET @totalsize = 1

        print char(10) + '/****************************************'
        print '* Fix High VLF for database: ' + @databasename
        print '* Starting VLF count: ' + @vlfcount
        print '****************************************/'
        print '-- Step 1: Get current log file size'
        print 'use [' + @databasename + ']'
        print 'SELECT name AS [File Name], physical_name AS [Physical Name], size/128.0 AS [Total Size in MB], size/128.0 - CAST(FILEPROPERTY(name, ''SpaceUsed'') AS int)/128.0 AS [Available Space In MB], type_desc'
        print 'FROM sys.database_files;'
        print char(10) + '-- Step 2: Take transaction log backup if needed, and then shrink the log file' + char(10) + 'DBCC SHRINKFILE (N''' + @physicalname + ''' , 0, TRUNCATEONLY);'
        print char(10) + '-- Step 3: Check if current VLF count is less than 50.'
        print '-- If not, take another log backup and repeat step #2 until VLF count is less than 50. Check for active transactions, etc.'
        print 'DBCC LogInfo;'
        print char(10) + '--Step 4: Grow log file back to original size'
        print 'USE [master];' + char(10) + 'GO'
        print 'ALTER DATABASE [' + @databasename + '] MODIFY FILE (NAME = N''' + @physicalname + ''', SIZE = ' + @totalsize + 'MB);' + char(10) + 'GO'
        print char(10) + char(10) + char(10)

        FETCH NEXT FROM filecrsr INTO @physicalname, @totalsize
    END
    CLOSE filecrsr
    DEALLOCATE filecrsr

    FETCH NEXT FROM dbcrsr INTO @databasename, @vlfcount
END
CLOSE dbcrsr
DEALLOCATE dbcrsr

DROP TABLE #DatabaseFiles
Thank you for reading! Stay tuned for my next post in the coming weeks.
Though it generally falls under the umbrella of the IT industry, cloud computing could be considered to be in a league of its own. Database experts are beginning to favor the scalable, flexible deployment option, as it allows them to efficiently accumulate and manage digital intelligence. IT companies are recognizing the popularity of this trend, and in turn helping their clients prepare for or execute cloud migration endeavors.
According to Chris Kanaracus, a contributor to Good Gear Guide, Oracle recently announced that it will allow its customers to transition their subscription plans for on-premise software programs to the company's Software-as-a-Service applications. The migration plan, appropriately named Customer 2 Cloud, enables consumers currently using Oracle's in-house human capital management and customer relationship management tools to switch over to cloud versions.
Many Oracle experts have stated that the business move is likely in response to increased pressure from competitors such as Workday and Salesforce.com. While the latter organization provides its customers with cloud-based CRM systems, the former constructs HCM and enterprise resource planning solutions that operate through SaaS models. Ray Wang, founder and chairman of Constellation Research, noted that many enterprises desire such cloud applications, but are often unsure of how to make the transition.
"Many customers need a defined methodology to go from on-premises to cloud," said Wang, as quoted by Kanaracus.
Oracle: a cloud game changer?
Oracle's Customer 2 Cloud program isn't the only solution the company has recently released that concerns cloud computing. In April, the corporation also released Solaris 11.2, a Unix-based operating system that provides businesses with a viable cloud management suite. ZDNet contributor Larry Dignan acknowledged Oracle CEO Mark Hurd's ambitious goal to spearhead new solutions and applications that will solidify the company's place as the leader of everything cloud. While such an endeavor may come off as a tall order, database administration service organizations have taken note of the latest Oracle projects.
For example, Dignan wrote that Oracle's Marketing Cloud can break down data channels and correlate them with IT and marketing features that can help businesses appeal to customers who exercise some kind of sway over other consumers. The program includes Oracle BlueKai, Oracle Content Marketing, Oracle Eloqua, Oracle Responsys and Oracle Social Cloud, ensuring that an integrated, concise solution will be at the disposal of CMOs throughout the world.
From what can be gathered of these recent initiatives, it's apparent that Oracle intends to assert itself as a serious competitor in the cloud application sector.
There are two IT aspects enterprises are looking for in regard to database administration: performance and data analytics. Aggregating large troves of digital information is becoming less of a challenge now than it was four years ago. Now, companies are more concerned with translating the data into actionable insight that can drive business transformation and agility.
Giving it a boost
One particular program that database support services have favored is Microsoft's SQL Server 2014, which has been lauded for its data analytics properties and processing capabilities. EnterpriseTech reported that storage and application performance enhancement developer Fusion-io is leveraging flash functions to accelerate the data warehouse management solution's core functions. Fusion-io President Lance Smith noted that SQL Server 2014 added two new ways in which Flash can be used to enhance database effectiveness.
- Use flash to accelerate the traditional server engine via Buffer Pool Extension, a SQL Server 2014 feature that allows database pages to be retrieved faster without consuming many resources.
- Leverage flash to keep Hekaton (the name given to Microsoft's in-memory online transaction processing engine) running better in tandem with the tables it stores and processes inside main memory.
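As a sketch of the first item, Buffer Pool Extension is enabled with a single ALTER SERVER CONFIGURATION statement; the file path and size below are placeholders, not recommendations:

```sql
-- Point the buffer pool extension at a flash/SSD-backed file
-- (path and size are examples only)
ALTER SERVER CONFIGURATION
SET BUFFER POOL EXTENSION ON
    (FILENAME = N'E:\SSDCACHE\ExampleCache.BPE', SIZE = 50 GB);

-- Verify the extension is enabled and sized as expected
SELECT path, state_description, current_size_in_kb
FROM sys.dm_os_buffer_pool_extension_configuration;
```

The extension caches clean pages only, so an SSD failure degrades performance rather than correctness.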
In a nutshell, Fusion-io's Flash utilization reduces the amount of sequential disk input/output and increases transaction log activity – making it easier to run complex analytics programs.
A platform for scrutiny
Nowadays, database experts realize that making heads or tails of the digital information at an enterprise's disposal involves executing effective information analysis programs. According to CIO Today, SQL Server 2014 is capable of supporting an environment that experiences heavy data scrutiny traffic.
Microsoft CEO Satya Nadella noted that the corporation sought to construct a platform possessing "ambient intelligence," or, an atmosphere that is both sensitive and responsive to an individual's presence. As can be imagined, creating such a system was no easy task.
"It requires technology to understand context, derive intent and separate signal from noise," said Nadella, as quoted by the source. "Building out a comprehensive platform that can enable this kind of ambient intelligence is a whole company initiative that we are uniquely qualified to undertake."
Ultimately, the platform, which has been described as "big data in a box" by some professionals, includes Internet of Things capabilities and delivers the building blocks SQL Server 2014 users require to connect, refine and analyze their data. It seeks to provide answers to those who have collected an incredible amount of digital information from various sources.
The United States economy has witnessed a general upswing since March. After the end of a harsh winter that slowed market progression, people are getting back to work and consumers are hitting the stores once again.
Increased fiscal activity means new expansion opportunities for enterprises. In the modern era, business growth is typically associated with enhancing big data capabilities. In a highly technical atmosphere, executives are considering the benefits associated with outsourcing their IT responsibilities to remote database support companies.
According to Forbes, the Bureau of Labor Statistics recently released its April employment report, which showed that 288,000 jobs were added in April – 70,000 more than expert economists initially predicted. The unemployment rate dropped from 6.7 percent to 6.3 percent, the lowest rate since before the onset of the 2008 financial crisis.
However, statistical representation can often be misleading. Dan North, chief economist at Euler Hermes North America, told the source that the unemployment rate fell so sharply over a short period because more people left the United States labor force. Nevertheless, North noted that the feeling of optimism hasn't dampened corporate earnings and that the marketplace is growing faster than anticipated.
Where has all the labor gone?
While the Federal Reserve tries to cope with a quickly growing economy, database administration companies are looking for human assets to fill key positions. Nonprofit organization CompTIA recently surveyed 305 IT firms across the United States for its quarterly IT Industry Business Confidence Index. Despite the fact that these enterprises expressed the desire to expand, many of them are having difficulty finding qualified personnel to fill new positions.
The index showed that 48 percent of the job opportunities offered by respondents consist of general support or service duties. Application development positions stood at 39 percent and cloud experts rang in at 33 percent. In addition, 29 percent of the open positions were for network engineers and 28 percent presented opportunities for professionals with security expertise.
Despite the lack of labor that seems to be plaguing the industry, it takes CIOs considerably less time to fill these positions (23 business days) than it does in other sectors. However, many IT professionals surveying candidates are quite picky about their choices – about 80 percent of respondents found recruiting for technical hires "challenging."
What this creates is an incredibly stressful work environment for database experts who are already on staff. It's not uncommon for those working for a managed IT services company to be on call constantly, sometimes putting in 70 hours in a work week. CompTIA's evidence clearly points to a need for more education in computer sciences.
Public sector officials and experts alike have largely celebrated the potential innovation and efficiency-driving power of the massive transition into the digital era, as data management outperforms traditional information governance in a variety of ways. In the health care sector, many organizations are still working to establish a stronger foothold in modernized data management practices because of the novelty of most systems.
Health Data Management recently listed several reasons why many businesses in this sector have issues with maintaining the best practices of data encryption, specifically regarding mobile device utilization. Citing security breaches that have taken place and impacted millions of individuals throughout the past several years, the news provider pointed out that health care providers are largely overlooking the necessary actions that must be taken to protect patient data that is being accessed by smartphones and tablets.
Encryption, when managed and monitored properly, can significantly improve the level of control an organization has over its data, regardless of whether it is in a storage environment or in transit at any given time. According to the source, the most common reasons behind a widespread lack of device, application and data encryption in this sector continue to be related to in-house expertise shortages and a lack of leadership acumen.
Like many other sectors in the modern market, the health care industry has had to tackle countless trends that have emerged in a relatively short period of time, leading to extraneous complexity and a generally uphill battle.
However, support is available for all types of organizations, especially when it comes to data management and security procedures. The trick is to know when those assistive frameworks are necessary, and where to look during the provisioning process. By proactively identifying data management needs, be it for security, accessibility or disaster recovery-related purposes, medical firms will be able to remain ahead of the curve.
Oracle experts working for companies using singular or multiple cloud platforms are anticipating a new update released by the database solutions developer. Cloud computing has grown more popular with time, making it only appropriate that one of the world's most renowned IT enterprises would develop a product that makes working with cloud servers much easier.
According to InfoWorld, Oracle recently introduced a Solaris update that includes a share of the OpenStack cloud management solution. It is believed that the system will make it easier for Oracle customers to operate mixed cloud environments. Released as a public beta on April 29, Solaris 11.2 possesses OpenStack components such as the Horizon dashboard, Nova compute module, Neutron network platform and Cinder block storage structure.
For those who aren't aware of what OpenStack brings to the table, the news source explained that the solution can be used to implement on-demand compute services on virtualized equipment as well as private cloud deployments.
Moving toward an integrated solution
Cliff Saran, a contributor to Computer Weekly, stated that Solaris 11.2 running on Oracle's aggressively priced Sparc servers makes for a more cost-effective solution, though it's debatable whether the combination will surpass Linux-based x86 servers housed in a data center environment. Database administration professionals have claimed that Oracle's recent move is likely a push for an integrated system.
"It is obvious that Oracle has delivered on its commitments regarding Sparc and is continuing its investments in Sparc CPU and system design as well as its Solaris operating system technology," said Richard Fichera, research director for Forrester, as quoted by the source.
Fichera further noted that, in order to remain competitive with Linux servers, the new Sparc deployments will feature multi-year, reliable performance progression and system scalability.
Data infiltration attempts are becoming just as much a part of the criminal underworld's practices as money laundering, drug trafficking and extortion.
Though it may be strange to compare a computer nerd with a notorious mobster, many database experts would suggest that the damage a well-rounded cybercriminal can cause exceeds the capabilities of a ruthless gang confined to the streets of New York City. Once an individual harboring ill intentions gains access to a company's servers, they can potentially steal thousands of data points representing business finances and consumer information.
Understanding what they're up against
Paul Kleinschnitz, general manager of First Data's cybersecurity solutions department, told Bank Info Security that many organizations lack a full comprehension of what they're up against. Although stealing intelligence through a computer seems more benign than pointing a gun at a person's head to obtain it, that doesn't make cybercriminals any less relentless or malevolent. In addition, the reach of such figures is much more expansive than traditional underworlds.
"One of the inhibitors to adoption in this space, in my opinion, has been a lack of knowledge of how these technologies work together, rather than compete," Kleinschnitz told the source.
Essentially, a comprehensive understanding of which tactics cybercriminals employ will give companies a better perception of how to deter attacks. Some executives hire database administration services well versed in data infiltration techniques to provide oversight of the criminal element's practices.
Employing tactics of their own
After comprehension of the environment working against them has been established, corporations should begin implementing cybersecurity practices and techniques that will deter infiltration attempts by hackers. Jonathan Hassell, a contributor to CIO, claimed that such attacks are going to happen no matter what enterprises do, but outlined four of the best practices they can employ, regardless:
- As customers often fall victim to cyberattacks, it's important that they are notified whenever a breach occurs. This will allow them to take any steps necessary to protect their critical information.
- Ultimately, malevolent figures operating on an online platform look for intelligence pertaining to personal finances and identity-based data. Therefore, businesses should protect the servers on which such knowledge is held.
- Lobby for legislation that supplements meager IT budgets so that investment in security software can occur.
- When an attack occurs, ask for assistance from database support services and other IT professionals.
Though the down payments associated with protection software may seem intimidating, it's important for executives to recognize that a well-orchestrated data breach could cost them millions of dollars.
Between private, hybrid and public cloud deployments, enterprises throughout the United States are concluding that adoption of one of these three services is worth "the risk." Though the idea of transferring all or most of their data onto a remote server seems daunting, a number of database administration professionals have claimed that the cloud possesses no more risks than conventional systems, just different ones.
Moving past the exploratory stage
Several years ago, the public cloud was primarily regarded as an experimental part of the IT industry, with businesses engaging with the technology in a limited manner. However, it appears that the atmosphere has transformed significantly, as many IT professionals consider Software-as-a-Service to be an integral part of the computing market.
According to a study conducted by Forrester, the cloud services market is predicted to be worth $191 billion by 2020. 2013 yielded $58 billion in revenue related to the technology, following a positive trend characteristic of the past three years.
In general, public cloud technology is witnessing increased adoption as time progresses. Whereas SaaS solutions stood at $4.7 billion at the beginning of last year, 2013 ended with a cumulative $36 billion profit. Database experts and CIOs surveyed by Forrester noted the likelihood of SaaS eroding traditional infrastructures, with new cloud-friendly applications spearheading this transition.
Different approaches for a new operation model
The assumption that the public cloud is less secure than traditional systems is slowly fading from existence. CIO reported that while attacks on cloud environments have increased in frequency and intensity, professionals such as Stephen Coty, chief security evangelist at Alert Logic, note that SaaS is not inherently more vulnerable than conventional on-premise platforms.
"Cloud deployments are no less secure than your own data centers," said Coty, as quoted by the source.
Alert Logic's Spring 2014 Cloud Security Report scrutinized 232,364 verified security incidents that were diagnosed between April 1 and September 30, 2013, finding that brute force attacks and malware/botnet infiltration attempts are now being sustained by cloud environments just as much as they are by on-premise systems. Coty noted that the reason for this is likely because more enterprises are migrating from in-house data centers to cloud servers, forcing cybercriminals to shift their priorities.
"As more enterprise workloads have moved into the cloud and hosted infrastructures, some traditional on-premise threats have followed them," noted Coty.
The professional acknowledged that database support services will likely reassess their priorities and develop protective measures designed to defend cloud environments from malicious figures.
With all of the excitement surrounding the Internet of Things comes a number of database experts who remain skeptical. It's not as if they're opposed to the idea of billions of devices connected to the Internet, it's whether or not companies will be able to connect with that digital intelligence and process it through their networks and analytics programs.
CIO contributor J.D. Sartain wrote that compatibility is the number one concern of many organizations engaging with big data. Standardization of all Internet-connected devices would provide an incredible amount of market insight to participants across multiple industries, but that would also mean that enterprises would lose a competitive edge within a particular sector, such as retail or infrastructure.
Sartain spoke with Maciej Kranz, vice president of the corporate technology group at Cisco, who claimed that security worries are causing database administration services to remain cautious about jumping into the IoT pool. For example, smart cars containing Ethernet technology leave their operating systems vulnerable to hacks, malware and denial-of-service attacks. In addition, preventative maintenance needs to be employed for devices, analysis applications and management programs that operate in real time, as their disruption could compromise an enterprise's operation.
"If an average oil rig generates 5TB of data per minute, how can vendors and corporations ensure that critical data is communicated to the appropriate device if, say, the temperature exceeds a critical threshold?" wrote Sartain, referencing Kranz's deductions.
Start with the basics
For corporations worried about optimal connectivity, simplicity might be their best bet. According to MIT Technology Review, New York City-based organization Bug Labs developed Freeboard, which provides people with a one-click method to publish data from a device to the company's webpage, Dweet.io. With the service, individuals possessing no prior experience with data analytics can receive diagnostic reports based upon the raw data produced by the device they use to access the site. GPS coordinates, PC mouse position and other observations are made available to the public, and another click will stop the sharing if the visitor so wishes.
"We are trying to make the Internet of Things far simpler, and far more accessible to anybody," said Bug Labs CEO Peter Semmelhack, as quoted by the source.
Because of its easy-to-use platform, Freeboard has gained traction with professionals such as Venkatesh Prasad, group and technical leader for vehicle design and infotronics at Ford Motor Company. Because the program translates data from nearly every IoT-related mechanism, it's likely that database administration specialists will see it as a cost-efficient avenue to process their raw information.
Here at RDX, our internal and external clients can have varying levels of SQL Server knowledge. Explaining technical details to clients is, in my opinion, one of the most rewarding tasks we perform. Not only does this allow us to provide client-specific recommendations, it also allows us to evolve from ‘support personnel’ into trusted team members. In my second blog post, I am addressing one of the topics that comes up often – SQL Server transaction logs.

Transaction Logs and Their Internal Structure
Each database within SQL Server has at least one transaction log file. The default file extension for these is “LDF”. The transaction log files are internally split into virtual log files (VLFs). VLFs are created when a transaction log file is created and when the transaction log file grows. With improper autogrowth settings or within large transaction log files, the number of VLFs can grow to the point that performance degrades. You can identify the number of VLFs in a database by running this command and noting the number of rows returned:
DBCC LogInfo

Recovery Models and Their Effects
Some data is transient or can easily be rolled forward through your application. For these databases, the Simple recovery model can be used. Full and differential backups are the available backup types. Records in the transaction log file are cleared once the transaction completes.

In other situations, you need to minimize data loss, also referred to as ‘point-in-time’ recovery. Using the Full recovery model in conjunction with full, transaction log, and, at times, differential backups allows you to meet this need. Records in the transaction log remain in the file until a transaction log backup is taken. Without transaction log backups, databases using the Full recovery model will experience unchecked transaction log file growth.

During large bulk operations, the Bulk-Logged recovery model can be used. This minimizes the amount of logging taking place in the transaction log and, therefore, your disk space requirements. Point-in-time recovery is not supported using this recovery model. If point-in-time recovery is required, transaction log backups should be taken immediately before changing the database to Bulk-Logged recovery and again after the database is changed back to Full recovery.

How Transactions are Tracked
Within the VLFs are records of all transactions that occur within the database, marked with log sequence numbers (LSNs). These LSNs are important for recovering your data to a point in time if a disaster or failure occurs. Every backup begins with a header containing the LSN details, and this information can be used to ensure you have all of the backups you need to recover. Restoring backups out of order, skipping a transaction log backup during a restore, and restoring only transaction log backups are not supported. If a recent transaction log backup is deleted, it will affect your ability to recover, and a new full backup should be taken as soon as possible. If you suspect a transaction log backup has been deleted, you can compare the header details of your backups.
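One way to compare those header details is RESTORE HEADERONLY, which reads a backup file's header, including the FirstLSN and LastLSN columns that show whether the log chain is unbroken; the path below is a placeholder:

```sql
-- Inspect a backup's header without restoring it.
-- The FirstLSN/LastLSN of consecutive log backups should chain
-- together; a gap means a log backup is missing or was deleted.
RESTORE HEADERONLY
FROM DISK = N'C:\Backups\YourDatabase_log.trn';
```

Running this against each file in your backup sequence makes a broken chain easy to spot before you need it in a disaster.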
Transaction logs are a critical concept to master when working with SQL Server. Without proper attention, they can degrade the performance of your databases or compromise your ability to recover from a disaster. Discussions between DBAs (‘accidental’ or otherwise) and application or business decision makers need to occur periodically to ensure that any data loss stays within the agreed SLA.
I hope this post has provided you with useful insight into your SQL Server transaction logs. In the coming weeks, Matt Nelson will be explaining how to manage the number of VLFs in your databases. Stay tuned!
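As a rough T-SQL sketch of that VLF fix, the backup/shrink/regrow cycle looks like the following. The database name, logical log file name, backup path and target size are all hypothetical placeholders; substitute your own, and prefer your existing log backup job where one exists:

```sql
USE [MyDatabase];  -- hypothetical database name
GO
-- 1. Check the current VLF count (one row per VLF).
DBCC LOGINFO;
GO
-- 2. Back up the transaction log (or run your existing maintenance plan job).
BACKUP LOG [MyDatabase] TO DISK = N'D:\Backups\MyDatabase_log.trn';  -- hypothetical path
GO
-- 3. Shrink the log file as close to 0 as possible (target size is in MB).
DBCC SHRINKFILE (N'MyDatabase_log', 1);  -- logical file name is hypothetical
GO
-- 4. Re-check the VLF count; if it is still high, repeat steps 2 and 3.
--    If the file will not shrink, check why the log is still in use:
SELECT name, log_reuse_wait_desc FROM sys.databases WHERE name = N'MyDatabase';
GO
-- 5. Grow the log back out to its original size in one operation.
ALTER DATABASE [MyDatabase]
    MODIFY FILE (NAME = N'MyDatabase_log', SIZE = 8GB);  -- original size is a placeholder
GO
```

Growing the file back in a single operation, rather than letting auto-growth do it in small increments, is what keeps the new VLF count low.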
Last week, Oracle introduced its Oracle Mobile Authenticator – a twist on two-factor authentication in which a user's smartphone doubles as an authentication device.
The move shouldn’t come as a surprise in an age when a person’s mobile device is more or less an extension of their identity. Mobile authentication simultaneously removes the cost and complexity of purchasing and maintaining authentication solutions like smart cards or tokens for companies, while giving their customers and workers the freedom to use their personal device.
Not only is the idea cost-effective – it's brilliant, and the timing couldn't be better, as digital security risk continues to rise and companies scramble to adopt two-factor authentication strategies.
Now let’s see how many companies take advantage and jump on this gem while it’s young.
To the chagrin of Microsoft executives, a number of business professionals still using the now unsupported Windows XP operating system are considering switching to Linux, which offers free software born of the open source development process. Though not necessarily the most popular desktop OS, Linux is now being praised by database experts as both operable and effective.
Just make the upgrade
InfoWorld contributor and avid Microsoft enthusiast Peter Bruzzese wrote that, although he understands why XP users are reluctant to move to Windows 8, there's little reason why they should avoid transitioning to Windows 7. For one thing, the user interfaces of the latter deployment and XP are virtually the same – it appears as if an XP user made a simple, rudimentary update. Essentially, there's already going to be a level of familiarity with Windows 7, and companies don't have to worry about employees getting acclimated to a new level of operability.
Although a number of database administration and IT professionals have lauded Linux for vastly improving its modern, signature OS, Bruzzese isn't convinced. For one thing, those used to working with Windows-compatible software would have to switch to alternative applications that former XP users would find vastly unfamiliar. This means that training in specific programs may be required – an investment a number of companies are unable to make.
Taking a chance?
Yet there are some professionals, such as San Francisco Gate contributor David Einstein, who believe there's untapped opportunity in Linux. Despite the fact that Linux runs many large enterprise and Web servers and is the basis for Android, it hasn't gained much traction with those looking for a desktop OS, primarily due to two factors:
- Past models weren't as easy to use as Mac OS or Windows.
- Not enough software was produced to run on it.
However, Einstein noted that a particular version of Linux has been customized to replace both Windows XP and 7. Under the brand name Zorin, the OS mimics the user interfaces of those two systems, countering Bruzzese's conclusion that XP users won't be able to adapt easily to any Linux OS. Prospective users can download Zorin onto their hard disks for free. Furthermore, the system offers user-friendly software such as a Microsoft-compatible office suite, a photo editing program and the Chrome browser.
Given the free price tag and familiar interface, there's a good chance enterprises will choose Zorin to replace their unprotected XP deployments.
Among a list of other reforms, the United States Affordable Care Act requires each state to construct a health insurance exchange to broker and manage care packages, determine eligibility, automate enrollment and electronically transfer patient information between participating entities. Oregon officials have relied on Oracle's database experts to build such a system, but may be considering a more consultative approach.
According to a report released by Oracle, Oregon's Department of Human Services perceived ACA mandates as an opportunity to upgrade its IT infrastructure and improve the organization's delivery of services. Before the health care legislation was signed, DHS was already in the process of upgrading programs that assisted impoverished families with dependent children, but was using an inadequate legacy system to determine citizen eligibility for Medicaid.
In addition, the federal legislation obligated the DHS to connect patients with coordinated care organizations – networks of physicians, dentists, mental health professionals and other treatment providers who have collaborated to support consumers who will receive benefits from insurance exchanges. It was evident that such an operation required the expertise of database administration professionals.
As a result, the DHS used the Oracle Enterprise Architecture Framework, as well as the IT developer's Architecture Development Process. Both of these elements provided a structure for updating the state's legacy infrastructure and ensured that any future developments would be adequately supported. Bob Ladouceur, information systems applications manager at Oregon DHS, claimed that the enterprise architecture enabled professionals to coordinate multiple activities within a versatile initiative to stay focused.
The result of the project was a health insurance exchange website, appropriately named Cover Oregon, which according to Oracle officials still isn't fully operational. According to the Statesman Journal, the state has blamed the database support services company for not meeting contract expectations, even though Oracle stated that Cover Oregon would not be ready for launch until October 1. Despite warnings issued by Oracle President and Chief Financial Officer Safra Catz, the state launched the website's services several months before the scheduled release date.
Although Oregonian officials claimed that Oracle continued to reassure DHS that the website was complete, Oracle responded by noting that the state rushed the process. This level of contention has led DHS to sever its partnership with the database architecture company, a move that many IT professionals perceive as unwise. Rather than resolving the issue, Oregon is searching for database administration services to help it proceed.
Complex networking, cloud computing and a list of other IT technologies have made data protection all the more difficult. The United States and Canada have implemented security compliance standards, but database experts are claiming that governments need to move past basic regulations and employ more intricate, thorough defense practices.
According to InfoWorld, Canadian authorities recently arrested 19-year-old Stephen Solis-Reyes for using the Heartbleed bug to steal taxpayer information from the Canada Revenue Agency's website. The CRA stated earlier this week that the vulnerability was leveraged to steal the Social Insurance Numbers of about 900 people, which caused the agency to temporarily prohibit citizens from filing online tax returns. Solis-Reyes was charged with one count of unauthorized use of a computer and one count of mischief in relation to data.
The fact that a reprobate as young as Solis-Reyes infiltrated a government website shows just how dexterous modern cybercriminals have become. For this reason, many public authorities have outsourced to remote database support providers capable of devoting an incredible amount of manpower and resources to deterring network intrusions. As it becomes more difficult to physically steal financial information, more deviants will turn to the Web and other IT-based avenues to obtain confidential intelligence.
Moving past the basics
It's not easy to gain access to a government database, but cybercriminals are quite capable of adapting to an environment known for its versatility. CIO noted that U.S. federal CIOs find it challenging to make infiltration difficult while still letting public employees connect to the digital information they need. Simon Szykman, CIO at the Department of Commerce, stated that one answer to this conundrum is automated surveillance, which enables networked assets to search for and report potential security breaches.
"We're now moving toward an area of much more automated and near real-time situational awareness where we have systems that themselves are able to verify that controls are being implemented," said Szykman, as quoted by CIO.
The National Oceanic and Atmospheric Administration told the news source that its database administration staff collects digital information from more than 20,000 devices. Due to automated monitoring, all of that data becomes centrally aggregated and analyzed, meaning that 1 billion events are processed every day. NOAA CIO Joe Klimavicz claimed that the organization blocks almost half a million malevolent Web connections each week.
Automated processing allows in-house IT departments and researchers to focus on developing new security techniques while maintaining a continuous overview of all network activities.
Between merchandisers obtaining data through e-commerce applications and industrial developers searching for ways to optimize critical infrastructure grid performance, database experts would agree that the complexity of the modern database has expanded. Professionals typically think of scalability when they refer to the changing environment, but it's more applicable to scrutinize the programming languages and analytics applications used by companies.
For IT professionals, using multiple languages to submit commands or evaluate digital information can be an arduous task. InfoWorld contributor Paul Venezia noted that computer technicians typically standardize on a single language and lean heavily on the commands and practices they use most. Using a different transaction method means they have to adjust to a new way of completing tasks.
The routine of using the same language can make professionals lose sight of the logic behind it. For example, if the same command were viewed through a different language, the person scrutinizing it might realize there was a faster, more optimal way of executing it. Experienced database administration personnel are often multilingual, capable of understanding the subtext within particular directions given to the server. Given the wide variety of data in play, it has become necessary for human IT assets to understand this variability.
According to InformationWeek, enterprise data warehouse company Teradata recently released a QueryGrid data-access layer capable of orchestrating multiple modes of analysis across several platforms, including those developed and issued by Oracle. Chris Twogood, vice president of product and services marketing for Teradata, noted that the program can automatically conduct analytics tasks without constant surveillance from human assets.
"Users don't care if information is sitting inside of a data warehouse or Hadoop and enterprises don't want a lot of data movement or data duplication," said Twogood, as quoted by the news source. "The QueryGrid gives them a transparent way to optimize the power of different technologies within a logical data warehouse."
As an example, Twogood cited a task database administration services could conduct for a retailer. QueryGrid would enable merchandisers to find high-value customers on the Web, then feed their comments from Facebook or Twitter into Hadoop. The program would then collect negative sentiment about the company and correlate it with data on those favorable, high-value consumers in order to work out how such individuals might influence others against churning.
Although the intricacy of today's digital information is expanding, so are the software programs used to organize and analyze it all.
There's no question that the information disclosed by Edward Snowden regarding the United States National Security Agency's surveillance techniques has shaken the technological world. As a result, many domestic and foreign businesses are instructing their database administration providers to reevaluate their active cloud deployments.
A study titled NSA Aftershocks recently queried 1,000 C-suite professionals specializing in information and communications technology and cloud computing. It showed that approximately 88 percent of CIOs are rethinking their cloud purchasing behavior, with 38 percent revising acquisition contracts previously established with vendors. Out of all the respondents, a mere 5 percent believe that it doesn't matter where enterprise data is stored.
In addition, the report discovered that the majority of corporations are solidifying contracts with cloud providers located domestically. A desire for closer proximity and a better overview of government legislation is driving this factor. Although corporations are unwilling to sacrifice cloud operability, executives are placing more emphasis on protection. About 84 percent of those surveyed for NSA Aftershocks reported that they are consulting with database experts to train in-house personnel in both rudimentary and advanced cloud security.
"The vast majority of respondents agreed that extra investment would go towards added security measures, training was also seen as a key priority," the study acknowledged.
A double-edged sword
Despite the fact that many enterprises are battening down the hatches, maneuverability cannot be abandoned. By allowing employees unrestricted access to data, corporations will be able to fluidly make key business decisions. However, as many workers choose to obtain company information via mobile devices, the surveillance responsibilities of database support services become ever more complicated.
Praveen Thakur, a contributor to online magazine The Nation, claimed that security professionals are executing in-depth, multi-layered approaches to data defense as opposed to employing conventional techniques that are largely ineffective in the face of complex communications technology. Instead of constructing bulwarks designed for the sole purpose of deterring threats, DBA services are developing protection methods that consistently prevent, detect and manage cyber threats.
Due to the fact that many enterprise employees use disparate applications and software to interact with digital intelligence, Thakur recommended outsourcing to IT professionals who can administer comprehensive protective measures. Collecting a variety of different security solutions to resolve separate issues can clutter an operation and actually do more harm than good by congesting system tools and features.
Many executives favor Microsoft products over competing software. Since its inception, the corporation has established itself as a developer of business-standard technology, with millions of subscribers distributed throughout the world. Due to recent improvements spearheaded by new CEO Satya Nadella, many organizations previously unfamiliar with the company's products are implementing Microsoft solutions with the help of database administration services.
Releasing a more affordable product
Pete Pachal, a contributor to technology blog Mashable, noted that Microsoft began selling Office 365 Personal earlier this week for $6.99 a month, accommodating subscribers with applications such as Word, Excel, PowerPoint and Outlook, among others. In contrast to the solution's counterpart, Office 365 Home, Personal only allows users to install the program on a single PC or Mac. However, the offer makes sense for enterprises working primarily with such machines.
Personal's integration with Microsoft's cloud solution, OneDrive, enables employees to share, store and edit files seamlessly. As this process expedites business operations, senior-level management may consider Office 365 to be a viable option for satisfying the needs of their departments. For those looking to abandon products manufactured by Microsoft's competitors, however, the transition may be easier said than done.
Steps for migration
Moving a large variety of email into Office 365 may require the assistance of database administration professionals. According to InfoWorld contributor Peter Bruzzese, corporations need to consider what information should be transitioned into Outlook, where that data is stored and whether or not it will be manipulated after all digital intelligence is successfully relocated. In order to ensure a smooth transfer, Bruzzese recommended making the following considerations:
- Perform a preparatory review of all messaging needs and orchestrate a plan that will supplement those requirements.
- If a company is migrating from Exchange, database support services can transfer all on-premises data into the cloud through Exchange Web Services, which allows users to export 400GB a day.
- Those relocating data from Google, Network File Systems or Notes should consider using Archive360, which can filter data through Exchange and then transfer it into Office 365.
- Companies transitioning email data from GroupWise could find solace in funneling the information through Mimecast and connecting the storage with Office 365 mailboxes.
Obviously, a command of certain programs is required, depending on what kind of route an organization chooses. For this reason, consulting database experts may be the best option.
Due to the prevalence of omnichannel retail, merchandisers are obligated to satisfy the inventory fulfillment requirements of brick-and-mortar stores and consumers. Instead of using human resources to scrutinize the distribution process, commodity-based companies are hiring database experts to implement business intelligence tools capable of providing actionable information regarding the supply chain.
What's redefining modern delivery systems?
E-commerce has allowed corporations to deliver their products to consumers residing in various parts of the country, creating variable demand for particular items. In order to anticipate customer desires, data analytics tools are being used to chart regional weather conditions, translate online surveys and monitor the distribution of materials. Jim Rice, a contributor to Supply Chain 24/7, stated that while transportation and storage processes cannot change the specifications of an item, they can revolutionize the way in which that particular product is delivered to a customer.
For example, a customized, direct-to-order method can transform consumer expectations. People don't want to wait to receive their purchased materials, even though an unspoken covenant was established the minute they finalized the order on a website. Therefore, database administration personnel employ programs that scrutinize which areas of the supply chain can be optimized to ensure that products are delivered as promptly as possible. The patterns these software solutions recognize are often overlooked by human eyes.
Enhancing global sourcing
Database engineering company Oracle recently announced the introduction of Oracle Global Trade Intelligence, a global commerce analytics application that provides organizations with the ability to leverage worldwide sourcing and distribution data to measure, predict and optimize the performance of their supply chains. Released in February, the program contains modifiable dashboards that enable enterprises to construct user-defined trade performance measurements that scrutinize import and export activities throughout the world.
Oracle experts and sourcing professionals are thrilled with the release, which also offers executives the chance to streamline communications between overseas departments. This process is expected to ensure that all materials are properly tracked, significantly reducing the chance of losing vital products. In addition, the program gives strategists the ability to anticipate the actions of both foreign and domestic competitors.
"Organizations are moving beyond automation of their global trade processes and are seeking ways to leverage their global trade data to make better business decisions," said Vice President of Value Chain Execution Product Strategy Derek Gittoes.
In the age of global commerce, it's imperative that companies possess programs akin to Oracle Global Trade Intelligence in order to expedite the shipment of goods and reduce the cost of those products on the consumer's end.
Recently, the Heartbleed bug has sent a shockwave through the global economy. The personal information of online shoppers, social media users and business professionals is at risk, and database administration providers are doing all they can to either prevent damage from occurring or mitigate the effects of what has already occurred.
What it does and the dangers involved
According to Heartbleed.com, the vulnerability poses a serious threat to confidential information, as it compromises the protection OpenSSL's Secure Sockets Layer/Transport Layer Security (SSL/TLS) technology provides for Internet-based communications. The bug allows anyone on the Web – particularly cybercriminals – to read the memory of systems protected by affected versions of the OpenSSL software, letting attackers monitor a wide array of transactions between individuals, governments and enterprises, among numerous other connections.
Jeremy Kirk, a contributor to PCWorld, noted that researchers at CloudFlare, a San Francisco-based security company, found that hackers could steal a site's private SSL/TLS keys and use them to create an encrypted avenue between users and websites, essentially posing as legitimate webpages in order to decrypt traffic passing between a computer and a server. For online retailers lacking adequate database support services, it could mean the divulgence of consumer credit card numbers. If customers no longer feel safe purchasing products online, it could potentially result in the bankruptcy of a merchandiser.
Think mobile devices are safe? Think again
Now more than ever, database experts are making concentrated efforts to effectively monitor communications between mobile devices and business information. As the Heartbleed Bug can compromise connections between PCs and websites, the same risk is involved for those with mobile applications bridging the distance between smartphones and Facebook pages. CNN reported that technology industry leaders Cisco and Juniper claimed that someone can potentially hack into a person's phone and log the details of his or her conversations. Sam Bowling, a senior infrastructure engineer at web hosting service Singlehop, outlined several devices that could be compromised:
- Cisco revealed that select versions of the company's WebEx service are vulnerable, posing a threat to corporate leaders in a video conference.
- If work phones aren't operating behind a defensive firewall, a malicious entity could use Heartbleed to access the devices' memory logs.
- Smartphone users accessing business files from iPhones and Android devices may be exposed, as hackers can view whatever information a person obtained through select applications.
Upgraded deployments of OpenSSL are patching the vulnerable avenues, but remote database services are still exercising assiduous surveillance in order to ensure that client information remains confidential.
The rise of the Internet of Things and the bring-your-own-device phenomenon have shaped the way database administration specialists conduct mobile device management. Many of these professionals are employed by retailers using customer relationship management applications that collect and analyze data from smartphones, tablets and numerous other devices. This level of activity creates a web of connectivity that's difficult to manage and often necessitates expert surveillance.
Managing the remote workplace
Merchandisers are challenged with the task of effectively securing all mobile assets used by their employees. Many of these workers have access to sensitive corporate information, whether it be product development files, customer loyalty account numbers or consumer payment data. According to CIO, some organizations lack the in-house IT resources to effectively manage the avenues through which intelligence flows from smartphones to servers.
As a result, small and midsize businesses often outsource to remote database support services to gain a comprehensive overview of their BYOD operations. David Lingenfelter, an information security officer at Fiberlink, told the news source that the problem many SMBs face is that their employees are using their own individual mobile devices to access company information. Many large enterprises often provide their workers with such machines, so there's inherent surveillance over the connections they're making.
Moving to the home front
Small, medium and large retailers alike are continuing to use CRM, which provides these commodity-based businesses with specific information regarding individuals. IoT has launched the capabilities of these programs, delivering data from a wide variety of smart mechanisms such as cars, watches and even refrigerators. Information being funneled into company servers comes from remote devices, creating a unique kind of mobile device management for database administration services to employ.
Frank Gillett, a contributor to InformationWeek, noted that many consumers are connecting numerous devices to a single home-based network, providing merchandisers with a view of how a family or group of apartment mates interacts with the Web. In addition, routers and gateways are acting as the default hubs that make network-connected homes ubiquitous.
"These devices bring the Internet to every room of the house, allowing smart gadgets with communications to replace their dumb processors," noted Gillett.
However, it's not as if the incoming information submitted by these networks can be thrown into a massive jumble. In order to provide security and organize the intelligence appropriately, remote DBA providers monitor the connections and organize the results into identifiable, actionable data.