Archive for November, 2012

Users listed in People Picker

Posted November 26, 2012 By Kevin Bennett

Here are some quick notes from working with the People Picker, the list of all users, on SharePoint 2010.

To see all users of the site: http://SiteAddress/_catalogs/users/simple.aspx

But that page only lets you delete users one at a time. To delete multiple users at once, go to:
http://SiteAddress/_layouts/people.aspx?MembershipGroupId=0

This will let you delete multiple users at a time, which is a huge time saver.
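
If you would rather script the cleanup, the same list-and-remove can be done from the SharePoint 2010 Management Shell. A minimal sketch, with the site URL and DOMAIN\olduser as placeholders:

# List every user known to the site collection (same data as simple.aspx)
Get-SPUser -Web "http://SiteAddress" | Select-Object UserLogin, DisplayName

# Remove a stale account; wrap in a loop over logins to bulk-delete
Remove-SPUser -Identity "DOMAIN\olduser" -Web "http://SiteAddress" -Confirm:$false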

Thanks:
http://iedaddy.com/2010/12/sharepoint-2010-deleted-and-recreated-user-doesnt-have-permissions-to-site-access-denied/

But our People Picker was still finding users from the old domain, so I found this: http://technet.microsoft.com/en-us/library/gg602075(d=lightweight,v=office.14).aspx

and set the People Picker to only search our current domain for users, with this command:
stsadm -o setsiteuseraccountdirectorypath -path "DC=,DC=local" -url https://SiteCollectionURL
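
To confirm the setting took, you can read the site collection's UserAccountDirectoryPath property back from PowerShell (a quick check; this assumes the SharePoint snap-in is loaded and uses the same placeholder URL):

# Read back the directory path the People Picker will search for this site collection
$site = Get-SPSite "https://SiteCollectionURL"
$site.UserAccountDirectoryPath
$site.Dispose()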


This is a simple little script that I added on to a previous script: it takes a computer BitLocker report and posts the totals to a SharePoint 2010 list.


# Check if the SharePoint snap-in is loaded already, and load it if not
if ( (Get-PSSnapin -Name Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue) -eq $null )
{
    Add-PSSnapin Microsoft.SharePoint.PowerShell
}

# Import the BitLocker report CSV produced by the earlier script
$reportcsv = Import-CSV q:\.csv

# Count machines with and without a recovery key on file
$totalTrue = $reportcsv | Where-Object { $_.HasBitlockerRecoveryKey -eq "TRUE" } | Measure-Object -Property HasBitlockerRecoveryKey -Line
$totalFalse = $reportcsv | Where-Object { $_.HasBitlockerRecoveryKey -eq "FALSE" } | Measure-Object -Property HasBitlockerRecoveryKey -Line

# Set our variables: site URL and list name
$spWeb = Get-SPWeb -Identity "https://"
$Summarylist = $spWeb.Lists[""]

# Add the report totals to the SharePoint list
$item = $Summarylist.Items.Add();
$item["True"] = $totalTrue.Lines;
$item["False"] = $totalFalse.Lines;
$item.Update();
$spWeb.Dispose();

Of course you should replace the blank values I put in above with your own info, but you get the idea.
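
For reference, here is the shape of input the script assumes; the column name is inferred from the Measure-Object call, and the file name is hypothetical since the path above was left blank. Group-Object is also a quick way to sanity-check the totals before posting them:

# Hypothetical input CSV from the earlier BitLocker report script:
#   ComputerName,HasBitlockerRecoveryKey
#   PC-001,TRUE
#   PC-002,FALSE

# Quick sanity check of the TRUE/FALSE totals
Import-Csv q:\bitlocker-report.csv |
    Group-Object HasBitlockerRecoveryKey |
    Select-Object Name, Count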

Kevin


Getting Document Information on a SharePoint Site

Posted November 6, 2012 By Kevin Bennett

When I started at this new position they were going through a conversion from SharePoint 2007 to 2010 on the intranet and extranet. Actually, the day before I started they attempted the conversion and failed, so as soon as I finished my “in processing” I was pulled out of classes and put to work.

One of the first major projects (after getting the sites back up and running) was to perform a document cleanup. As many companies without a web governance standard know, after several years of uploading documents they didn’t have a handle on what was actually on the site. In order to see what was actually there, I found/modified the following script (if I got it from you, please comment if you find this so I can give credit where it is due).


function Get-DocInventory() {
    [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
    $farm = [Microsoft.SharePoint.Administration.SPFarm]::Local
    foreach ($spService in $farm.Services) {
        if (!($spService -is [Microsoft.SharePoint.Administration.SPWebService])) {
            continue
        }
        foreach ($webApp in $spService.WebApplications) {
            # Skip the Central Administration web application
            if ($webApp -is [Microsoft.SharePoint.Administration.SPAdministrationWebApplication]) { continue }

            foreach ($site in $webApp.Sites) {
                foreach ($web in $site.AllWebs) {
                    foreach ($list in $web.Lists) {
                        # Only inventory document libraries, not other list types
                        if ($list.BaseType -ne "DocumentLibrary") {
                            continue
                        }
                        foreach ($item in $list.Items) {
                            $data = @{
                                "Web Application" = $webApp.ToString()
                                "Site"            = $site.Url
                                "Web"             = $web.Url
                                "List"            = $list.Title
                                "Item ID"         = $item.ID
                                "Item URL"        = $item.Url
                                "Item Title"      = $item.Title
                                "Item Created"    = $item["Created"]
                                "Item Modified"   = $item["Modified"]
                                "File Size (KB)"  = $item.File.Length/1KB
                            }
                            New-Object PSObject -Property $data
                        }
                    }
                    $web.Dispose()
                }
                $site.Dispose()
            }
        }
    }
}
Get-DocInventory | Export-Csv -NoTypeInformation -Path d:\temp\inventory.csv

The script will query the sites on the current farm (I would run this from the WFE server) and return all the documents. Note that it also returns items like default.aspx and master pages, but I would just filter those out of the spreadsheet.
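
Since the output is just a CSV, that filtering can be scripted too. A small sketch that drops page artifacts by file extension (the extension list is my assumption):

# Keep only rows whose item URL does not end in a page or master-page extension
Import-Csv d:\temp\inventory.csv |
    Where-Object { $_."Item URL" -notmatch '\.(aspx|master)$' } |
    Export-Csv d:\temp\inventory-documents.csv -NoTypeInformation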

Using one of my past write-ups, I would then have another PowerShell script upload the results to a list on our cleanup site and graph the information with FusionCharts so it looked good.

The conclusion of the project, by the way: 289k documents trimmed down to roughly 150k… at least it’s a start!
