Wednesday, June 9, 2010

My First Microsoft Office SharePoint Server 2007 Project Issues

Hi to all

Recently I worked on a MOSS 2007 project. During development I faced a lot of issues as I was very new to the technology, but thank God I survived.

There are two issues I especially want to discuss here.

First, a strange link redirection problem, and second, large files getting corrupted or failing to upload to Document Libraries.

Issue No. 1:

I have a site collection that contains lots of sites for different company departments. I was facing a very strange issue: if a user clicked any list or library's "Actions" menu > "Add to My Links", the user was redirected to the development server where the portal was initially created. This was really strange for me because the user's My Site and the "Add to My Links" option at the top of the page were working perfectly fine. At first I thought it was because of the backup and restore of the site, but after struggling a little I found a configuration entry in the root-level site settings. Follow these steps to fix it.

Navigate to the root-level site of your application's site collection.

Click Site Actions > Site Settings > Site Collection Administration > Portal Site Connection. Verify that the link to the main content site and its name are correctly specified here.
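If you prefer to check or fix the same setting through the object model, it is exposed by the SPSite.PortalName and SPSite.PortalUrl properties. Here is a minimal console-app sketch (the URLs and names below are placeholders, not real values, and it must run on the SharePoint server with a reference to Microsoft.SharePoint):

using System;
using Microsoft.SharePoint;

class PortalConnectionCheck
{
    static void Main()
    {
        // Placeholder URL: point this at your own site collection.
        using (SPSite site = new SPSite("http://yourserver/sites/yourcollection"))
        {
            // Show where the portal site connection currently points.
            Console.WriteLine("Portal: {0} -> {1}", site.PortalName, site.PortalUrl);

            // If it still points at the old development server, repoint it (placeholder values).
            site.PortalUrl = "http://yourproductionportal";
            site.PortalName = "Company Portal";
        }
    }
}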

Issue No. 2:

The second issue was more painful: some files either got corrupted while uploading to the server or didn't upload at all, and I received a "Page not found" error. I started with a 1 MB file. Bingo, it uploaded successfully. Then I tried to upload a file of nearly 50 MB (but less than 50 MB) and got the "Page not found" error again, which meant the problem was with large files. I checked Central Administration > Application Management > Web application general settings and found that the maximum upload size was defined as 50 MB. After some Googling I found that it is an IIS/ASP.NET issue and not a SharePoint problem: a special configuration entry needs to be defined in web.config to allow uploads larger than roughly 28 MB. Below is the entry that needs to be defined inside the configuration section of the web application's web.config file.
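A typical entry, assuming you want to match the 50 MB limit (maxRequestLength is specified in kilobytes and goes inside the <system.web> section), looks like this:

<configuration>
  <system.web>
    <!-- Allows uploads of up to roughly 50 MB; the value is in KB. Adjust it to your own limit. -->
    <httpRuntime maxRequestLength="51200" />
  </system.web>
</configuration>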



Sunday, March 21, 2010

SharePoint Custom Web Parts Development

If you are new to SharePoint and haven't started custom web part development yet (other than creating Data View/Data Form and other OTB [out of the box] web parts), read the article below on MSDN. It's great for SharePoint newbies :)

http://msdn.microsoft.com/en-us/library/ms452873.aspx
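To give a first taste of what the article walks through: a custom web part for SharePoint 2007 typically derives from the ASP.NET WebPart class and builds its UI in CreateChildControls. A minimal sketch (class and namespace names here are just examples):

using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;

namespace Blog.WebParts   // example namespace
{
    public class HelloWebPart : WebPart
    {
        protected override void CreateChildControls()
        {
            // Real web parts would build their full UI (and wire up events) here.
            Label label = new Label();
            label.Text = "Hello from a custom web part!";
            this.Controls.Add(label);
        }
    }
}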

Friday, February 20, 2009

Accessing Object's Properties at Run Time using .NET Reflection

Hi to All.

In my last project, during the design phase, I came across a somewhat different scenario. My UI layer invoked a service layer (generated by the Microsoft Software Factory, Modeling Edition), which in turn invoked the business layer.

Between the UI and service layers I had service objects (i.e. data contracts), and between the service and business layers there were business objects. The problem was that all of my service objects had some common fields for audit purposes, and I didn't want to explicitly set these fields for each insert and update operation from the UI. Since the service objects were data contracts generated by a tool, I also didn't want to alter the generated code, because the tool overwrote it each time someone regenerated the data contracts. I didn't find any inheritance feature between data contracts, otherwise I would have gone for a base/child pattern.

So what to do? I had objects that were all of different types but had common fields that needed to be set from a single location.

The solution is .NET reflection. Using reflection you can access an object's properties and any other metadata at run time, yes, right at run time.

Consider the code below; it will tell you the whole story by itself.

// Requires: using System.Reflection;
public void AddAuditInformation(object entityObject, bool isNewRecord, int userID)
{
    if (entityObject != null)
    {
        PropertyInfo propertyInfo = null;

        // If the entity is new, we have to set the CreatedBy and CreationDate fields as well.
        if (isNewRecord)
        {
            propertyInfo = entityObject.GetType().GetProperty("CreatedBy");
            propertyInfo.SetValue(entityObject, userID, null);

            propertyInfo = entityObject.GetType().GetProperty("CreationDate");
            propertyInfo.SetValue(entityObject, System.DateTime.Now, null);
        }

        // The last-updated fields are set for both inserts and updates.
        propertyInfo = entityObject.GetType().GetProperty("LastUpdatedBy");
        propertyInfo.SetValue(entityObject, userID, null);

        propertyInfo = entityObject.GetType().GetProperty("LastUpdateDate");
        propertyInfo.SetValue(entityObject, System.DateTime.Now, null);
    }
}
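Just for illustration, calling it from a service operation might look like this (CustomerContract, currentUserId and customerRepository are hypothetical names, not from the real project):

// Hypothetical data contract with the common audit properties.
CustomerContract customer = new CustomerContract();
customer.Name = "Acme Ltd";

// Sets CreatedBy, CreationDate, LastUpdatedBy and LastUpdateDate in one place via reflection.
AddAuditInformation(customer, true, currentUserId);

customerRepository.Insert(customer);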

Friday, May 23, 2008

Maintaining Soft Delete

Hi, if you are a developer you must know the very first concept of a foreign key constraint. While writing software applications that access a database, you do not need to worry about the foreign key constraint: the DBMS maintains it for you, and if you try to delete a row from a table that is referenced elsewhere (and which doesn't have cascade delete enabled), you get a foreign key violation exception, and bingo! You can catch the exception and show it to the user. Depending on the level of implementation you want, you can also inspect the exception's error code and show a friendly message based on it.

But what if your client belongs to some genius category and tells you to build an application in which no record should ever be removed from the system? The application must still have delete functionality, so the user can delete a record from the application, but in the database the record must only be marked as "Deleted". What you have to do is add an "Is_Deleted" bit column to each of your tables and simply update its value to "True" on the user's delete action. At the same time, all of your select queries must include a check to fetch only those rows whose Is_Deleted equals "False".
But here comes a problem.
What if you have a record in a master table, and that master record is also associated with some child table records? Marking that master record as deleted will make your data inconsistent, because now you will have child records in your database whose master record no longer exists at the application level. In other words, you have to maintain the foreign key constraint yourself.
A good and complete approach is to first check all child tables before marking any master table record as deleted, and if any child record exists that references that master record, raise a custom error from the delete stored procedure/query.
Here is a simple sample procedure to accomplish the desired task.

PS: My table's "is deleted" column is actually named "ISACTIVE_FLAG", so a row is marked deleted by setting it to 0.


CREATE PROCEDURE [dbo].[sp_CORE_NA_SEAT_Delete]
(
@Original_NA_SEAT_ID int,

/***********Params required for Auditing*********/
@LAST_UPDATE_DATE datetime,
@LAST_UPDATED_BY bigint,
@IP nvarchar(1024),
@AUDIT_1 nvarchar(1024),
@AUDIT_2 nvarchar(1024),
@LOCATION nvarchar(1024)
/************************************************/

)
AS
SET NOCOUNT OFF;

IF EXISTS(
SELECT * FROM CORE_NA_SEAT INNER JOIN (SELECT * FROM CORE_ELECTORAL_AREA WHERE ISACTIVE_FLAG = 1) as E
on CORE_NA_SEAT.NA_SEAT_ID = E.NA_SEAT_ID
AND CORE_NA_SEAT.NA_SEAT_ID = @Original_NA_SEAT_ID)
OR
EXISTS(
SELECT * FROM CORE_NA_SEAT INNER JOIN VM_VOTER
on CORE_NA_SEAT.NA_SEAT_ID = VM_VOTER.NA_SEAT_ID
AND CORE_NA_SEAT.NA_SEAT_ID = @Original_NA_SEAT_ID)
OR
EXISTS(
SELECT * FROM CORE_NA_SEAT INNER JOIN VM_VOTER_REMOVAL_RECORD
on CORE_NA_SEAT.NA_SEAT_ID = VM_VOTER_REMOVAL_RECORD.NA_SEAT_ID
AND CORE_NA_SEAT.NA_SEAT_ID = @Original_NA_SEAT_ID)

BEGIN
RAISERROR ('National Assembly Seat can''t be deleted because it is being used by some other records.', 16, 1);
END
ELSE

UPDATE [CORE_NA_SEAT] SET [ISACTIVE_FLAG] = 0,
LAST_UPDATE_DATE = @LAST_UPDATE_DATE,
LAST_UPDATED_BY = @LAST_UPDATED_BY,
IP = @IP, AUDIT_1 = @AUDIT_1, AUDIT_2=@AUDIT_2,
LOCATION=@LOCATION

WHERE ([NA_SEAT_ID] = @Original_NA_SEAT_ID)
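On the application side, the RAISERROR (severity 16) surfaces as a SqlException in ADO.NET, so the friendly message can be shown straight from the exception. A minimal sketch (connectionString, seatId and ShowMessage are assumed names, and the remaining audit parameters are elided):

// Requires: using System.Data; using System.Data.SqlClient;
using (SqlConnection connection = new SqlConnection(connectionString))
using (SqlCommand command = new SqlCommand("sp_CORE_NA_SEAT_Delete", connection))
{
    command.CommandType = CommandType.StoredProcedure;
    command.Parameters.AddWithValue("@Original_NA_SEAT_ID", seatId);
    // ... the audit parameters (@LAST_UPDATE_DATE, @LAST_UPDATED_BY, @IP, ...) must be added here too ...

    try
    {
        connection.Open();
        command.ExecuteNonQuery();
    }
    catch (SqlException ex)
    {
        // The RAISERROR message ("National Assembly Seat can't be deleted ...") arrives here.
        ShowMessage(ex.Message);
    }
}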




Monday, February 11, 2008

The hidden power of Table Adapters (.NET 2.0)

Hi to all.
We used table adapters to create the DAL (Data Access Layer) in our last project, so I decided to write a blog post about their advantages and disadvantages. We were using SQL Server 2005 as the RDBMS. We had different choices for the DAL, but our lead decided to go for table adapters. Although it looked silly at the start to work with them, in the end we found table adapters a robust and modern approach for creating a DAL (I know there are a lot of modern DAL generators available). The beauty of table adapters is their one-click re-configuration flexibility. Basically, a table adapter is automatically created when you add a new typed DataSet to your project. Each table adapter is associated with a data table in that dataset and performs DB operations only on that table. With all their beauty, table adapters also have some ugly things; I hope these will be fixed in the next release of .NET.
I will explain how you can use table adapters to generate a DAL for your project. Here is how you go. Open VS 2005 and create a new website with C# as the language. I named it "Table_Adapter_Test_Site".







In the website solution, add a new project of type "Class Library" and name it "blog.DAL".





Add a new DataSet to the blog.DAL project and name it "CORE_PROVINCE.xsd".



After adding the DataSet you will see a blank blue screen, something like this.
It indicates that there is currently no data table in this typed dataset.


To add a data table to this typed dataset you need a connection to the database. If the Server Explorer window is not open, open it from View > Server Explorer.
In Server Explorer, right-click "Data Connections" and click "Add Connection...". You will see an Add Connection screen like this. Provide the data source, server name, credentials, and DB name to connect to your database (MS SQL). Click the "Test Connection" button to verify that everything is OK so far.



You will see your newly added connection in the Data Connections list of Server Explorer. Expand Tables and drop any table onto the DataSet. I dropped the Province table (provinces of Pakistan) onto my dataset.



Note that a table adapter is automatically created with this data table and is attached at the bottom of the table. The newly generated table adapter (CORE_ProvinceTableAdapter) has two default query-based methods, Fill and GetData, and both are used to SELECT data from the database.
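As a quick illustration, the generated methods can be used like this (the exact generated type and namespace names depend on your dataset and table names, so treat these as assumptions):

// Assuming: using blog.DAL; using blog.DAL.CORE_PROVINCETableAdapters;
CORE_ProvinceTableAdapter adapter = new CORE_ProvinceTableAdapter();

// GetData creates and returns a new typed data table filled from the database.
CORE_PROVINCE.CORE_PROVINCEDataTable provinces = adapter.GetData();

// Fill populates an existing typed data table instance and returns the row count.
CORE_PROVINCE.CORE_PROVINCEDataTable table = new CORE_PROVINCE.CORE_PROVINCEDataTable();
int rowCount = adapter.Fill(table);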


At this stage you can directly use this table adapter to perform CRUD operations, but to see what is actually running inside this table adapter, right-click it and click "Configure...".

You will see the SELECT statement for the CORE_PROVINCE table. You can change the select statement, and if required you can also use the Query Builder to generate complex (joined) select statements.




After clicking the Next button you can change the names of the default select functions of this table adapter. On the same window you can also see that the table adapter has generated direct methods for CRUD operations on this DB table. Click Next and then Finish.




Rebuild the blog.DAL project and the DAL for the CORE_PROVINCE table is ready. Now you can explore the methods which the typed DataSet provides for you, but I will concentrate more on user-defined DAL functions. Consider a user who wants to add a complex update statement involving operations on many DB tables as a single DB operation; he can add this functionality in a few clicks. For example, I have an update stored procedure defined in my database and I want to use this procedure with this table adapter. Note that your stored procedure can be much more complex, and of any type. I created a stored procedure named Update_Province_By_English_Name. This sp takes a single parameter and returns the number of rows affected. The sp is something like this.
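The screenshot of the procedure is not reproduced here, so the following is only an illustrative shape (table and column names are assumptions); the point is simply that it takes one parameter and reports the number of rows affected:

CREATE PROCEDURE [dbo].[Update_Province_By_English_Name]
(
    @PROVINCE_ENGLISH_NAME nvarchar(255)   -- assumed parameter name and type
)
AS
SET NOCOUNT OFF;   -- keep row counts so the generated method can return rows affected

-- Illustrative body only; the real procedure's logic is not shown in this post.
UPDATE CORE_PROVINCE
SET LAST_UPDATE_DATE = GETDATE()                        -- assumed audit column
WHERE PROVINCE_ENGLISH_NAME = @PROVINCE_ENGLISH_NAME    -- assumed column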

Now, to use this sp as a function of the table adapter, right-click the table adapter and click "Add Query".
On this window select the "No Value" option and click Next.

On the next window, select "Use existing Stored Procedure" and click next.



You will see the list of all stored procedures defined in the database. Select the sp that you want to use.



Provide the name of the function which will execute this procedure. By default it is the same as the sp name.




Now you can see your newly added function in your table adapter.

Using this function is a piece of cake.
Note that the table adapters are generated in their own namespace, so to use any table adapter you have to add its namespace to your code file. The sketch below gives an idea of how to use this function.
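A minimal sketch, again assuming the generated names shown earlier:

// Assuming: using blog.DAL.CORE_PROVINCETableAdapters;
CORE_ProvinceTableAdapter adapter = new CORE_ProvinceTableAdapter();

// The wizard-generated method has the same name as the stored procedure by default
// and, for a "No Value" query, returns the number of rows affected.
int rowsAffected = adapter.Update_Province_By_English_Name("Sindh");   // parameter value is just an example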

Now we will discuss the disadvantages of table adapters.
1. The major drawback of table adapters is that they store the database name as a prefix in the XML definition of each function you add, and also in the default Fill and GetData functions.
The problem occurs when you create a project using one database and eventually need to change its name. When you change the database name and ask the table adapter to execute such a function, you will get an error.
To solve this problem you can remove the database name prefix from the XML file of the dataset (the .xsd). To do so, search for "DbObjectName" in the DAL project.
You will see that all of your table adapters' default and user-defined functions have the database name as a prefix in their names. Remove all DB name prefixes from the DbObjectName attributes and build your project again.
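For example, an attribute that reads something like DbObjectName="MyOldDatabase.dbo.CORE_PROVINCE" (the database name here is only for illustration) should become DbObjectName="dbo.CORE_PROVINCE".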



2. The second disadvantage of table adapters is that each table adapter uses a different connection object, so if you want to combine two or more table adapter functions in a single transaction scope (.NET System.Transactions), you either have to start the DTC service (Distributed Transaction Coordinator) for your MS SQL Server or follow the workaround below.
In order to solve this problem without starting the DTC service, you have to write a function in all of your table adapter classes that assigns a single connection object to all of their commands.
To do so, add a new partial class file (.cs) for your dataset (.xsd) and write the following code:

using System.Data.Common;
using System.Data.SqlClient;

namespace Kalsoft.ECP.CERS.DataAccess.CORE_PROVINCETableAdapters
{
    public partial class CORE_PROVINCETableAdapter
    {
        public void AssignConnection(DbConnection sameCon)
        {
            if (sameCon != null)
            {
                SqlConnection sqlConn = sameCon as SqlConnection;
                this._connection = sqlConn;

                if (this.Adapter.InsertCommand != null)
                {
                    this.Adapter.InsertCommand.Connection = sqlConn;
                }
                if (this.Adapter.DeleteCommand != null)
                {
                    this.Adapter.DeleteCommand.Connection = sqlConn;
                }
                if (this.Adapter.UpdateCommand != null)
                {
                    this.Adapter.UpdateCommand.Connection = sqlConn;
                }

                // The query-based methods (Fill, GetData and any queries you added) use CommandCollection.
                for (int i = 0; i < this.CommandCollection.Length; i++)
                {
                    if (this.CommandCollection[i] != null)
                    {
                        this.CommandCollection[i].Connection = sqlConn;
                    }
                }
            }
        }
    }
}

This method assigns the provided connection object to all of the commands present in the table adapter. Before calling a DB operation of any table adapter inside a particular transaction scope, you need to call its AssignConnection method.
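A sketch of how this can be used to keep everything on one local transaction (the connection string, the second adapter and the data tables are assumed names, and the System.Transactions assembly must be referenced):

// Requires: using System.Data.SqlClient; using System.Transactions;
using (TransactionScope scope = new TransactionScope())
using (SqlConnection connection = new SqlConnection(connectionString))
{
    connection.Open();   // opened inside the scope, so it enlists in the ambient transaction

    CORE_PROVINCETableAdapter provinceAdapter = new CORE_PROVINCETableAdapter();
    CORE_DISTRICTTableAdapter districtAdapter = new CORE_DISTRICTTableAdapter();   // hypothetical second adapter

    // Point every command of both adapters at the same open connection,
    // so the transaction stays local and is not escalated to DTC.
    provinceAdapter.AssignConnection(connection);
    districtAdapter.AssignConnection(connection);

    provinceAdapter.Update(provinceTable);   // typed data tables holding pending changes (assumed)
    districtAdapter.Update(districtTable);

    scope.Complete();
}
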
3. The third and last problem with table adapters is that if you change the default schema of their CRUD operations, they do a very silly thing with the DateTime "IsNull" parameters of the update procedure.
Consider that you have a DateTime parameter named "UpdateDate" and that this field is nullable. Eventually you need to change the schema of the delete procedure, meaning you have your own delete procedure and you want it to execute when a row is deleted from the data table. You can do this by right-clicking, choosing Configure, and assigning the required procedure to be called when the adapter's update function runs on a row marked as deleted.
Everything will work fine, but if you change any row's values and try to call Update, you will see an error like:
"Failed to convert Int32 from DateTime"
What happened?
The table adapter marked the IsUpdateDateNull parameter of its update procedure as DateTime, but it actually expects an Int32.
To solve this problem, select the table adapter > open the Properties window > click UpdateCommand > open the Parameters collection > change the IsUpdateDateNull parameter's type to Int32. Rebuild the DAL and try again; this should fix the problem.

Welcome to my blog

Dear knowledge seeker.
First of all, I would like to tell you something about myself. My name is Muhammad Faheem Zafar. I did my BS from the University of Karachi, and I have been working as a Software Engineer at a reputable software house in Pakistan since April 2007.
I am passionate about software coding techniques and technologies and would also like to share my knowledge with others.
I hope you will find my blog interesting and valuable at the same time.