Transaction log file growth and database mirroring

If you have a database mirroring setup, it is important to take regular transaction log backups. If you do not maintain the transaction logs they will grow indefinitely and eventually fill up your drive; I have seen this happen.

When this happens, what you need to do is back up the transaction log and then shrink the log file using DBCC SHRINKFILE. Here is an example script which backs up the transaction log of a Blog database, works out how much free space is available, then shrinks the file down to the space used plus 10%.

BACKUP LOG Blog TO DISK = 'E:\MSSQLSERVER\Backups\blog_log.bak'
 
CREATE TABLE #TempPerf(
      [Database Name] VARCHAR(255),
      [Log Size (MB)] DECIMAL(12, 2),
      [Log Space Used (%)] DECIMAL(12, 2),
      [Status] INT,
      FreeSpace as ([Log Size (MB)] - ([Log Size (MB)] * ([Log Space Used (%)] / 100.0)))
)
 
INSERT INTO #TempPerf EXEC('DBCC SQLPERF(logspace)')
 
DECLARE @shrinkToMB INT;
 
SET @shrinkToMB = (
      SELECT TOP 1 (([Log Size (MB)] - FLOOR(FreeSpace)) * 1.1) as TargetSize -- space used + 10%
      FROM #TempPerf
      WHERE [Database Name] = 'Blog'
)
 
-- note: DBCC SHRINKFILE must be run in the context of the Blog database (USE Blog)
DBCC SHRINKFILE(Blog_log, @shrinkToMB)
 
DROP TABLE #TempPerf

This approach will not work in all cases. After running this script on a client's system which had this problem, the transaction log did not shrink, and we got the following message:

(1 row(s) affected)
Cannot shrink log file 2 (_log) because the logical log file located at the end of the file is in use.

(1 row(s) affected)
DBCC execution completed. If DBCC printed error messages, contact your system administrator.

After some googling this article came up:

http://blogs.technet.com/b/mdegre/archive/2011/09/04/unable-to-shrink-the-transaction-log.aspx

– Ran DBCC OPENTRAN which returned ‘No active open transactions’
– Value of ‘log_reuse_wait_desc’ is LOG_BACKUP

The ‘log_reuse_wait_desc’ column in the ‘sys.databases’ catalog view indicates why the transaction log could not be cleared or truncated; see this article: http://www.sqlskills.com/blogs/paul/why-is-log_reuse_wait_desc-saying-log_backup-after-doing-a-log-backup/
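
You can check the value for your database with a simple query; for example (using the same Blog database as above):

SELECT name, log_reuse_wait_desc
FROM sys.databases
WHERE name = 'Blog';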

Basically the current VLF is preventing the transaction log file from being truncated. One solution, suggested here: http://dba.stackexchange.com/questions/64771/how-to-do-a-one-time-log-shrink-on-a-database-with-transactional-replication, is switching the database to the simple recovery model (which will cause transactions to stop being replicated, so you will probably need to re-initialize replication), then shrinking the log and switching the database back to the full recovery model.

In actual fact you can't switch the database to the simple recovery model while mirroring is configured; SQL Server will not let you. You need to remove the mirror first, then you can change the recovery model and shrink the log, and then you will need to reconfigure mirroring.
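
For reference, the workaround ends up looking something like the following sketch, assuming the same Blog database and backup location as above; again, this is only possible once the mirror has been removed:

ALTER DATABASE Blog SET RECOVERY SIMPLE;
DBCC SHRINKFILE(Blog_log, 10); -- target size in MB; pick something sensible for your workload
ALTER DATABASE Blog SET RECOVERY FULL;
-- a full (or differential) backup is needed to restart the log backup chain
BACKUP DATABASE Blog TO DISK = 'E:\MSSQLSERVER\Backups\blog_full.bak';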

References:

http://blogs.technet.com/b/mdegre/archive/2011/09/04/unable-to-shrink-the-transaction-log.aspx
https://social.msdn.microsoft.com/Forums/sqlserver/en-US/d1c533cd-aa7e-4774-9b85-b73ddf3b7873/sql-server-2008-r2-mirror-database-shrink-transcation-log

Load Workflow Class: System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. —> System.Collections.Generic.KeyNotFoundException: The given key was not present in the dictionary.

I ran into an interesting issue today on one of the SharePoint servers we manage. The issue was affecting all workflows across a single site collection; the workflows would just fail immediately with a 'Failed to Start (Retrying)' message. When I looked at the logs I found these errors:

Load Workflow Class: System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.Collections.Generic.KeyNotFoundException: The given key was not present in the dictionary.   
 at System.ThrowHelper.ThrowKeyNotFoundException()   
 at System.Collections.Generic.Dictionary`2.get_Item(TKey key)   
 at Microsoft.SharePoint.Workflow.SPWorkflowManager.IsConfigForSite(SPSite site)   
 at Microsoft.SharePoint.Workflow.SPWorkflowManager.GetWorkflowConfurationSection(SPSite site, String section)   
 at Microsoft.SharePoint.Workflow.SPWinOeHostServices.EnsurePluggableServices(SPSite site, SPWorkflowExternalDataExchangeServiceCollection services, ExternalDataExchangeService existingServices)   
 at Microsoft.SharePoint.Workflow.SPWinOeHostServices..ctor(SPSite site, SPWeb web, SPWorkflowManager manager, SPWorkflowEngine engine)
 --- End of inner exception stack trace ---
 at System.RuntimeMethodHandle._InvokeConstructor(Object[] args, SignatureStruct& signature, IntPtr declaringType)   
 at System.Reflection.RuntimeConstructorInfo.Invoke(BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)   
 at System.RuntimeType.CreateInstanceImpl(BindingFlags bindingAttr, Binder binder, Object[] args, CultureInfo culture, Object[] activationAttributes)   
 at Microsoft.SharePoint.Workflow.SPWorkflowManager.LoadPluggableClass(String classname, String assembly, Object[] parameters)

 

System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.Collections.Generic.KeyNotFoundException: The given key was not present in the dictionary.   
 at System.ThrowHelper.ThrowKeyNotFoundException()   
 at System.Collections.Generic.Dictionary`2.get_Item(TKey key)   
 at Microsoft.SharePoint.Workflow.SPWorkflowManager.IsConfigForSite(SPSite site)   
 at Microsoft.SharePoint.Workflow.SPWorkflowManager.GetWorkflowConfurationSection(SPSite site, String section)   
 at Microsoft.SharePoint.Workflow.SPWinOeHostServices.EnsurePluggableServices(SPSite site, SPWorkflowExternalDataExchangeServiceCollection services, ExternalDataExchangeService existingServices)   
 at Microsoft.SharePoint.Workflow.SPWinOeHostServices..ctor(SPSite site, SPWeb web, SPWorkflowManager manager, SPWorkflowEngine engine)
 --- End of inner exception stack trace ---
 at System.RuntimeMethodHandle._InvokeConstructor(Object[] args, SignatureStruct& signature, IntPtr declaringType)   
 at System.Reflection.RuntimeConstructorInfo.Invoke(BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)   
 at System.RuntimeType.CreateInstanceImpl(BindingFlags bindingAttr, Binder binder, Object[] args, CultureInfo culture, Object[] activationAttributes)   
 at Microsoft.SharePoint.Workflow.SPWorkflowManager.LoadPluggableClass(String classname, String assembly, Object[] parameters)   
 at Microsoft.SharePoint.Workflow.SPWorkflowManager.GetService(SPWorkflowAssociation association, SPWorkflowEngine engine)   
 at Microsoft.SharePoint.Workflow.SPWorkflowManager.RunWorkflowElev(SPWorkflow workflow, Collection`1 events, SPWorkflowRunOptionsInternal runOptions)

When I googled these errors I found an article suggesting a fix, and I tried that approach but had no luck, so I opened up .NET Reflector and had a look at the IsConfigForSite method on the SPWorkflowManager class, where I found the following:

[.NET Reflector screenshot: SPWorkflowManager.IsConfigForSite disassembly]

The exception is thrown when the method tries to access the IisSettings dictionary using the SPUrlZone.Default enum value, so I opened up a PowerShell console and had a look at what was in the IisSettings dictionary:

$site = Get-SPSite http://mysite
$site.WebApplication.IisSettings

This showed that there was no entry for the default zone in the IisSettings dictionary. Initially I thought we just needed to reconfigure the alternate access mappings and add an entry for the default zone, but this had no effect.
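
You can also test for the missing key directly; a quick check along these lines (using the public SPUrlZone enum) returns False when the Default zone entry is missing:

$site.WebApplication.IisSettings.ContainsKey([Microsoft.SharePoint.Administration.SPUrlZone]::Default)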

So I started discussing this issue with one of my colleagues who had worked on some issues with this server the previous week. I knew he had made some pretty significant changes, and as it turned out one of those changes was that he had extended the web application to the intranet zone and then deleted the previous web application.

This was the root cause of the problem, and the reason there was no default entry in the IisSettings dictionary. I extended the web application to the default zone and the workflows started working again.
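
If you prefer to do the re-extend in PowerShell rather than Central Administration, it looks roughly like this; treat it as a sketch, the URL, name and port are placeholders for your environment:

$webApp = Get-SPWebApplication http://mysite
New-SPWebApplicationExtension -Identity $webApp -Name 'MySite Default' -Zone Default -Url 'http://mysite' -Port 80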

Upgrading User Profile Service Application from 2010 to 2013

The first thing to do is make sure your environment is set up properly and all your service accounts are set up and configured correctly; a good point of reference is Spencer Harbar's guide to the User Profile Service: http://www.harbar.net/articles/sp2010ups.aspx

Before upgrading your existing databases I recommend creating a new User Profile Service Application to confirm that you have your environment configured correctly. Make sure the User Profile Service Application is functioning correctly and that both the ‘User Profile Service’ & ‘User Profile Synchronization Service’ services are started.

I scripted this in PowerShell using this script: https://gallery.technet.microsoft.com/Create-a-User-Profile-c2136dc0

Once you have a working User Profile Service Application it's time to start the upgrade process. First delete the User Profile Service Application you just created, ticking the ‘Delete data associated with the Service Applications’ check box, then backup and restore your Profile, Sync & Social databases from the 2010 environment to the new 2013 environment.
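
The restore side of that is a standard SQL Server restore on the 2013 farm's database server, repeated for each of the three databases; a rough sketch (database name and paths are placeholders, add MOVE clauses if your file layout differs):

RESTORE DATABASE [ProfileDB] FROM DISK = 'E:\Backups\ProfileDB.bak' WITH RECOVERY;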

You will also need to export the encryption key from the 2010 environment using miiskmu.exe, which is located in C:\Program Files\Microsoft Office Servers\14.0\Synchronization Service\Bin. Just run this executable and follow the prompts (you will need to enter the farm account credentials), then copy the exported key file to the 2013 environment.

Then, using the same PowerShell script as before, create the User Profile Service Application, making sure the database names in the script match the newly restored 2010 databases; the databases will be upgraded in the process of creating the User Profile Service Application.
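
The key step in that script boils down to something like this; a sketch only, the names are placeholders and the gallery script linked above handles the rest of the plumbing:

New-SPProfileServiceApplication -Name 'User Profile Service Application' `
    -ApplicationPool 'SharePoint Web Services Default' `
    -ProfileDBName 'ProfileDB' `
    -SocialDBName 'SocialDB' `
    -ProfileSyncDBName 'SyncDB'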

Don't be surprised if the ‘User Profile Synchronization Service’ service doesn't start properly at this point; you will need to stop both the ‘User Profile Service’ & ‘User Profile Synchronization Service’ services anyway to import the encryption key.

To import the encryption key you will need to log onto the server as the farm account, and run the command prompt as administrator, then run the following command:

miiskmu.exe /i "c:\path to your key file" {0E19E162-827E-4077-82D4-E6ABD531636E}

Note: the GUID in this command is the same everywhere; I'm still not sure what it means or why you need it. Also, if you do not log onto the server as the farm account for this process, your synchronization service will not start and will log errors like ‘User Profile Application: SynchronizeMIIS encounters an exception: System.NullReferenceException: Object reference not set to an instance of an object.’ to the ULS.

If all goes well you should get a message window pop up saying the operation was successful.

Now restart the ‘User Profile Service’ & ‘User Profile Synchronization Service’ services, and that's it; you should be up and running at this stage.

Once that is set up you might want to add the service proxy to the default proxy group, which is one thing the above script does not do; you can add the proxy to the default group via PowerShell like this:

# 'guid of service proxy' is a placeholder for the ID of your User Profile service application proxy
$proxy = Get-SPServiceApplicationProxy -Identity 'guid of service proxy'
$group = Get-SPServiceApplicationProxyGroup -Default
Add-SPServiceApplicationProxyGroupMember $group -Member $proxy

References:
http://community.spiceworks.com/topic/492436-starting-ups-synchronization-service-sp2013
https://social.technet.microsoft.com/Forums/en-US/e2cab627-9e8f-4f6e-91b2-fa873ce50940/user-profile-service-application-upgrade-issues
https://support.office.com/en-ie/article/Microsoft-Office-Servers-2010-FAQ-ReadMe-02aa8131-abd8-41e5-9dac-f317c0916f64
http://www.harbar.net/articles/sp2010ups2.aspx

Database Restructuring

A while ago I worked on a project where we were building a new web interface for an existing system originally built by another company, and one of the requirements was that we use the existing database. Most of the tables in this database had primary key columns, however they were not identity columns and did not auto increment. This caused problems with our ORM framework when inserting new records: because the columns were not identity columns, LINQ to SQL could not get the last inserted identity. The solution I came up with was to replace the existing primary key columns with new auto-incrementing identity columns, which presented a number of challenges:

  • You can't update identity columns, so to get around this I move the data from each table into a temporary table, drop the existing primary key column, create the new identity column seeded to the max value of the old primary key, then re-insert all the data with SET IDENTITY_INSERT ON.
  • Dropping the old primary key columns would violate foreign key constraints, so I cache the foreign key relationships in a table variable and drop all the constraints; once all the modifications have been made and all the data re-inserted, I re-apply the foreign key constraints.

So to do this I wrote a SQL script which generates SQL to alter the database. First we load up the names of all the tables affected by this issue and set up some variables which will be used throughout the script:

DECLARE @tables TABLE(
	TABLE_NAME varchar(max)
)

-- get all the tables that need to be altered
INSERT @tables SELECT TABLE_NAME FROM INFORMATION_SCHEMA.COLUMNS 
		WHERE COLUMN_NAME = 'row_key' 
			AND TABLE_NAME NOT IN ('table_1', 'table_2') 

We also want to exclude any tables that have the ‘row_key’ column but already auto increment (hence the hard-coded exclusions). Next we copy all the foreign key constraints so we can recreate them when we are finished:

DECLARE @foreign_keys TABLE(
	AlterStatement varchar(max),
	ConstraintName varchar(100),
	TableName varchar(100)
)

-- copy the relevant foreign key constraints
INSERT @foreign_keys SELECT  'ALTER TABLE ' + object_name(a.parent_object_id) +
		' ADD CONSTRAINT ' + a.name +
		' FOREIGN KEY (' + c.name + ') REFERENCES ' +
		object_name(b.referenced_object_id) +
		' (' + d.name + ')' as AlterStatement, 
		a.name as ConstraintName,
		object_name(a.parent_object_id) as TableName
	FROM    sys.foreign_keys a
			JOIN sys.foreign_key_columns b
				ON a.object_id=b.constraint_object_id
			JOIN sys.columns c
				ON b.parent_column_id = c.column_id AND a.parent_object_id=c.object_id
			JOIN sys.columns d
				ON b.referenced_column_id = d.column_id AND a.referenced_object_id = d.object_id
	WHERE   object_name(b.referenced_object_id) IN (SELECT TABLE_NAME FROM @tables)
	ORDER BY c.name

We then generate the drop-constraint SQL and print it out (declaring the cursor and the variables we fetch into first):

DECLARE @constraint_cursor CURSOR;
DECLARE @alter_statement varchar(max), @constraint_name varchar(100), @constraint_table_name varchar(100);

SET @constraint_cursor = CURSOR FOR SELECT * FROM @foreign_keys

-- drop the foreign key constraints
OPEN @constraint_cursor

FETCH NEXT FROM @constraint_cursor INTO @alter_statement, @constraint_name, @constraint_table_name
WHILE @@FETCH_STATUS = 0
BEGIN
	PRINT 'ALTER TABLE [' + @constraint_table_name + '] DROP CONSTRAINT ' + @constraint_name
	PRINT 'GO'
	
	FETCH NEXT FROM @constraint_cursor INTO @alter_statement, @constraint_name, @constraint_table_name
END

CLOSE @constraint_cursor;
DEALLOCATE @constraint_cursor
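
The generated output at this point looks something like the following (table and constraint names are illustrative):

ALTER TABLE [orders] DROP CONSTRAINT FK_orders_customers
GO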

We then loop through the tables with a cursor, and this is where we do most of the work. I've decided not to break up this section of SQL because I thought it would be easier to read as one block of code; I've just added comments throughout:

DECLARE @table_cursor CURSOR;
DECLARE @table_name varchar(max);

SET @table_cursor = CURSOR FOR SELECT * FROM @tables;
OPEN @table_cursor;

PRINT 'EXEC sp_msforeachtable ''ALTER TABLE ? NOCHECK CONSTRAINT ALL'''
PRINT 'GO'
DECLARE @row_key_max int;
FETCH NEXT FROM @table_cursor INTO @table_name;
WHILE @@FETCH_STATUS = 0
BEGIN
	-- get the max row_key of the current table
	DECLARE @ParmDefinition nvarchar(100) = '@row_key_max_out int OUTPUT';
	DECLARE @sql_to_the_max nvarchar(100) = 'SELECT @row_key_max_out = MAX(row_key) FROM [' + @table_name + ']'
	EXECUTE sp_executesql @sql_to_the_max, @ParmDefinition, @row_key_max_out=@row_key_max OUTPUT;
	
	-- get the primary key constraint
	DECLARE @pk_constraint_name varchar(100);
	SELECT @pk_constraint_name = Col.CONSTRAINT_NAME from 
		INFORMATION_SCHEMA.TABLE_CONSTRAINTS Tab, 
		INFORMATION_SCHEMA.CONSTRAINT_COLUMN_USAGE Col 
			WHERE Col.Constraint_Name = Tab.Constraint_Name
				AND Col.Table_Name = Tab.Table_Name
				AND Constraint_Type = 'PRIMARY KEY'
				AND Col.Table_Name = @table_name
	
	IF @row_key_max IS NULL
		SET @row_key_max = 0;

	PRINT 'SP_RENAME ''[' + @table_name + '].row_key'', ''original_row_key'', ''COLUMN'';';
	PRINT 'GO'
	
	-- get default value constraint for original_row_key
	DECLARE @default_value_constraint varchar(100);
	SELECT @default_value_constraint = c.name FROM sys.all_columns a 
		INNER JOIN sys.tables b 
			ON a.object_id = b.object_id
		INNER JOIN sys.default_constraints c
			ON a.default_object_id = c.object_id
		WHERE b.name= @table_name
			AND a.name = 'row_key'
	
	IF @default_value_constraint IS NOT NULL
	BEGIN
		PRINT 'ALTER TABLE [' + @table_name + '] DROP CONSTRAINT ' + @default_value_constraint + ';' 
		PRINT 'GO'
	END
	
	IF @pk_constraint_name IS NOT NULL
	BEGIN
		PRINT 'ALTER TABLE [' + @table_name + '] DROP CONSTRAINT ' + @pk_constraint_name + ';' 
		PRINT 'GO'
	END
	
	-- disable all triggers and add the new column
	PRINT 'DISABLE TRIGGER ALL ON [' + @table_name + '];'
	PRINT 'GO'
	PRINT 'ALTER TABLE [' + @table_name + '] ADD row_key INT PRIMARY KEY IDENTITY(' + CAST((@row_key_max + 1) as varchar) + ',1) NOT NULL;'
	PRINT 'GO'

	-- cache all the data in another table and delete everything in the original table
	PRINT 'SELECT * INTO [' + @table_name + '_temp] FROM [' + @table_name + ']'
	PRINT 'GO'
	PRINT 'DELETE FROM [' + @table_name + ']' 
	PRINT 'GO'

	-- drop the original_row_key column and set IDENTITY_INSERT ON
	PRINT 'ALTER TABLE [' + @table_name + '] DROP COLUMN original_row_key'
	PRINT 'GO'
	PRINT 'SET IDENTITY_INSERT [' + @table_name + '] ON'
	
	-- get all the column names for the table
	DECLARE @columns varchar(max); 	
	SELECT @columns = COALESCE(@columns+', ', '') + COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS 
		WHERE TABLE_NAME = @table_name AND COLUMN_NAME != 'original_row_key' AND COLUMN_NAME != 'row_key'
	
	-- get all the value column names for the table
	DECLARE @value_columns varchar(max);
	SELECT @value_columns = COALESCE(@value_columns+', ', '') + COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS 
		WHERE TABLE_NAME = @table_name AND COLUMN_NAME != 'original_row_key' AND COLUMN_NAME != 'row_key'
	
	-- reinsert all the data back into the original table
	PRINT 'INSERT INTO [' + @table_name + '] (row_key, ' + @columns + ') SELECT original_row_key, ' + @value_columns + 
					' FROM [' + @table_name + '_temp]'
	-- turn off identity_insert and turn the check constraints back on
	PRINT 'SET IDENTITY_INSERT [' + @table_name + '] OFF'
	PRINT 'ALTER TABLE [' + @table_name + '] CHECK CONSTRAINT ALL;'
	PRINT 'GO'
	
	-- drop the _tmp table and re-enable triggers
	PRINT 'DROP TABLE [' + @table_name + '_temp]'
	PRINT 'GO'
	PRINT 'ENABLE TRIGGER ALL ON [' + @table_name + '];'
	PRINT 'GO'
	
	SET @columns = NULL;
	SET @value_columns = NULL;
	SET @constraint_name = NULL;
	SET @default_value_constraint = NULL;
	SET @pk_constraint_name = NULL;
	
	FETCH NEXT FROM @table_cursor INTO @table_name;
END
PRINT 'EXEC sp_msforeachtable ''ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL'''
PRINT 'GO'

CLOSE @table_cursor;
DEALLOCATE @table_cursor;

Then the only thing left to do is re-apply the constraints and we are done:

-- re-apply the foreign key constraints

SET @constraint_cursor = CURSOR FOR SELECT * FROM @foreign_keys

OPEN @constraint_cursor

FETCH NEXT FROM @constraint_cursor INTO @alter_statement, @constraint_name, @constraint_table_name
WHILE @@FETCH_STATUS = 0
BEGIN
	PRINT @alter_statement
	PRINT 'GO'
	
	FETCH NEXT FROM @constraint_cursor INTO @alter_statement, @constraint_name, @constraint_table_name
END

CLOSE @constraint_cursor;
DEALLOCATE @constraint_cursor 

Now you will need to do some testing to make sure there are no unwanted side effects from restructuring the database in this way, but in this instance, for this system, it worked perfectly.
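
One sanity check worth running after the generated script completes (my suggestion, not part of the original script) is to ask SQL Server to report any rows that now violate the re-applied constraints:

-- reports rows that violate enabled (and disabled) constraints in the current database
DBCC CHECKCONSTRAINTS WITH ALL_CONSTRAINTS;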

PowerShell, adding and connecting webparts

I recently had an issue with a PowerShell script I was writing for a SharePoint project. I needed to add two webparts to a page and connect them; they were an XsltListViewWebPart and a Nintex List Form webpart, and I was following these articles:

http://social.msdn.microsoft.com/Forums/sharepoint/en-US/1cb661a0-b42f-4ec8-a42a-9a5d1fe1dce1/connect-xslt-list-view-and-query-string-filter-web-part-via-powershell?forum=sharepointdevelopment
http://troyvssharepoint.blogspot.in/2012/08/web-parts-connections-via-powershell.html

I could add the webparts to the page without any problems, but when I tried to connect them the connection wouldn't save properly. The problem turned out to be that I just needed to re-get the SPLimitedWebPartManager after adding the webparts, then create the connection.

And here is an example script:

Add-PSSnapin Microsoft.SharePoint.PowerShell
$url = 'http://mysiteurl'
$web = Get-SPWeb $url
$wpGallery = $web.ParentWeb.Lists["Web Part Gallery"]
$wpManager = $web.GetLimitedWebPartManager($web.Url + "/pages/test.aspx", [System.Web.UI.WebControls.WebParts.PersonalizationScope]::Shared)

Write-Host 'clearing page...'
while ($wpManager.WebParts.Count -gt 0)
{
    $wpManager.DeleteWebPart($wpManager.WebParts[0])
}
$recordList = $web.Lists['Record List']
$viewFields = New-Object System.Collections.Specialized.StringCollection
$viewFields.Add("RecordID") | Out-Null

# NOTE: the original parameter-binding XML and CAML view query appear to have been
# stripped by the blog engine; substitute your own <ParameterBinding> markup and
# CAML query for the RecordID filter here
$parameterBindings = ''
$viewQuery = '{RecordID}'

$view = $recordList.Views | where { $_.Title -eq 'RecordByID' }
if ($view -eq $null) {
    Write-Host 'creating RecordByID view...'
    $view = $recordList.Views.Add("RecordByID", $viewFields, $viewQuery, 100, $True, $False, "HTML", $False)
    $view.ParameterBindings = $parameterBindings
    $view.Update()
}

Write-Host 'creating filtered list view webpart...'
$recordByID = New-Object Microsoft.SharePoint.WebPartPages.XsltListViewWebPart
$recordByID.ChromeType = [System.Web.UI.WebControls.WebParts.PartChromeType]::TitleOnly
$recordByID.Title = "Record List"
$recordByID.ListName = ($recordList.ID).ToString("B").ToUpper()
$recordByID.ViewGuid = ($view.ID).ToString("B").ToUpper()
$recordByID.ParameterBindings = $parameterBindings
$recordByID.TitleUrl = $url + '/' + $view.Url
$recordByID.WebId = $recordList.ParentWeb.ID
$wpManager.AddWebPart($recordByID, "Header", 1)

Write-Host 'creating nintex list form webpart...'
$wpl = $wpGallery.Items | where { $_.Title -eq '$Resources:NFResource,WebPart_List_Form_Title;' }
$xmlReader = New-Object System.Xml.XmlTextReader($wpl.File.OpenBinaryStream())
$errorMsg = ""
$webPart = $wpManager.ImportWebPart($xmlReader, [ref]$errorMsg)
$webPart.Title = 'List Form'
$webPart.Mode = 'Edit'
$wpManager.AddWebPart($webPart, "Header", 1)

##### you must re-get the webpart manager #####
$wpManager.Dispose()
Write-Host 'refreshing webpart manager...'
$wpManager = $web.GetLimitedWebPartManager($web.Url + "/pages/test.aspx", [System.Web.UI.WebControls.WebParts.PersonalizationScope]::Shared)
$consumerWebPart = $wpManager.WebParts | where { $_.Title -eq 'List Form' }
$providerWebPart = $wpManager.WebParts | where { $_.Title -eq 'Record List' }

Write-Host 'getting connection points...'
$providerConnectionPoints = $wpManager.GetProviderConnectionPoints($providerWebPart)
$consumerConnectionPoints = $wpManager.GetConsumerConnectionPoints($consumerWebPart)
# find a provider/consumer pair with a matching connection interface
foreach ($pc in $providerConnectionPoints) {
    $consumerCon = $consumerConnectionPoints | where { $_.InterfaceType -eq $pc.InterfaceType }
    if ($consumerCon -ne $null) {
        $providerCon = $pc
        break
    }
}

Write-Host 'connecting webparts...'
$newCon = $wpManager.SPConnectWebParts($providerWebPart, $providerCon, $consumerWebPart, $consumerCon)
$wpManager.Dispose()
$web.Dispose()

Write-Host 'done'

Error running Android virtual device

About a week ago I ran into an error whilst following an Android SDK tutorial (http://developer.android.com/training/basics/firstapp/index.html). A Google search seemed to indicate that the error means your hard drive is full, which I initially thought was bullshit, but then I checked the /tmp directory and found it was only 100M.

A little background: I am running Sabayon, which is a Linux distribution based on Gentoo, and apparently someone at Sabayon or Gentoo decided to implement /tmp as a ‘tmpfs’ filesystem. tmpfs is a MEMORY-based filesystem, so things that need to write to /tmp write to memory instead of disk; it's pretty fast, but the downside is that it takes up memory.

Anyway, the default fstab entry caps this tmpfs at 100M, which is what causes the problem.
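
You can confirm the size, and how full it is, with df:

df -h /tmp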

error:

Starting emulator for AVD 'vAndroid'
NAND: could not write file /tmp/android-mick/emulator-M2JFc8, Resource temporarily unavailable

solution:

Edit /etc/fstab as root and comment out the /tmp line. From the command line, run the following as root:

gedit /etc/fstab

And comment out the following line with a hash (#):

tmpfs /tmp tmpfs noexec,nosuid,nodev,size=100M 0 0

Now reboot and you should be able to run the virtual device. /tmp will now be on your root partition, which will give you plenty of space, however /tmp access will be much slower; alternatively you could just increase the tmpfs size.
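
If you would rather keep the speed of tmpfs and just enlarge it, bump the size= value in the same fstab line instead of commenting it out; 2G here is just an example, size it to your available RAM:

tmpfs /tmp tmpfs noexec,nosuid,nodev,size=2G 0 0

You can also resize it on the fly without a reboot:

mount -o remount,size=2G /tmp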