Anton's thoughts on consulting, project management and application development.

Update: Sam Saffron has fixed this issue in the latest version of Mvc-Mini-Profiler.

It took days to figure out why Firefox was aborting all requests after my login screen. So, I will make the conclusion of my search bold and underlined:

If you are using Mvc-Mini-Profiler and only logging to the database, the number of GUIDs that it inserts into your HTTP responses (X-MiniProfiler-Id headers) will eventually grow to the point where Firefox silently drops any requests to your website. Firefox will show GET requests with undefined response codes and refuse to load any further pages from your site until Mvc-Mini-Profiler is turned off.

The fix is to create a custom SqlStorage provider for Mvc-Mini-Profiler that marks every entry as viewed by forcing the HasUserViewed column to true. Here is my attempt at this:

    public class MvcMiniProfilerStorage : SqlServerStorage
    {
        public MvcMiniProfilerStorage(string connectionString)
            : base(connectionString)
        {
        }

        /// <summary>
        /// 	Stores the profiler to dbo.MiniProfilers under its Id;
        /// 	stores all child Timings and SqlTimings to their respective tables.
        /// </summary>
        public override void Save(MiniProfiler profiler)
        {
            const string sql =
                @"insert into MiniProfilers
            (Id,
             Name,
             Started,
             MachineName,
             [User],
             Level,
             RootTimingId,
             DurationMilliseconds,
             DurationMillisecondsInSql,
             HasSqlTimings,
             HasDuplicateSqlTimings,
             HasTrivialTimings,
             HasAllTrivialTimings,
             TrivialDurationThresholdMilliseconds,
             HasUserViewed)
select       @Id,
             @Name,
             @Started,
             @MachineName,
             @User,
             @Level,
             @RootTimingId,
             @DurationMilliseconds,
             @DurationMillisecondsInSql,
             @HasSqlTimings,
             @HasDuplicateSqlTimings,
             @HasTrivialTimings,
             @HasAllTrivialTimings,
             @TrivialDurationThresholdMilliseconds,
             @HasUserViewed
where not exists (select 1 from MiniProfilers where Id = @Id)";
            // this syntax works on both mssql and sqlite

            using (DbConnection conn = GetOpenConnection())
            {
                int insertCount = conn.Execute(sql,
                    new
                        {
                            profiler.Id,
                            Name = Truncate(profiler.Name, 200),
                            profiler.Started,
                            MachineName = Truncate(profiler.MachineName, 100),
                            User = Truncate(profiler.User, 100),
                            profiler.Level,
                            RootTimingId = profiler.Root.Id,
                            profiler.DurationMilliseconds,
                            profiler.DurationMillisecondsInSql,
                            profiler.HasSqlTimings,
                            profiler.HasDuplicateSqlTimings,
                            profiler.HasTrivialTimings,
                            profiler.HasAllTrivialTimings,
                            profiler.TrivialDurationThresholdMilliseconds,
                            // BUG: Too many X-MiniProfiler-Id headers cause
                            // Firefox to stop all requests
                            //
                            // This hack marks all entries as read so that
                            // they do not end up part of that header.
                            HasUserViewed = true    
                        });

                if (insertCount > 0)
                {
                    SaveTiming(conn, profiler, profiler.Root);
                }
            }
        }

        private static string Truncate(string s, int maxLength)
        {
            return s != null && s.Length > maxLength ? s.Substring(0, maxLength) : s;
        }
    }

Then it’s a simple matter of making sure that Mvc-Mini-Profiler uses this class rather than its own SqlServerStorage class:

MiniProfiler.Settings.Storage = new MvcMiniProfilerStorage("connStr");

This should take care of your Mvc-Mini-Profiler headaches.

Update: You may also be interested in my post about creating a dashboard to review Mvc-Mini-Profiler logs. Source code available on Google Code.

I have recently integrated the Mvc-Mini-Profiler tool into JobSeriously. One thing that JobSeriously does is make heavy use of ActionFilters to handle cross-cutting concerns. The more I thought about it, the more I recognized a need to see the performance of those custom filters as part of the Mvc-Mini-Profiler stream. After all, what good is it to see that an action takes a long time, only to find out that most of the time is spent in your spiffy new ActionFilter?

So, I set out on a path to automagically profile all of the ActionFilters (and ResultFilters) throughout my ASP.NET MVC application. The first logical step is to come up with a wrapper class that can wrap an ActionFilterAttribute and attach profiling to it (without altering the implementation or imposing overhead when not profiling).

Due to some under-the-hood implementation choices by the MVC team, we have to wrap ActionFilterAttributes based on the interfaces they support. In this case, it’s IActionFilter and IResultFilter.

    public class ProfiledFilterWrapper : IActionFilter, IResultFilter
    {
        private readonly IActionFilter actionFilter;
        private readonly IResultFilter resultFilter;

        public ProfiledFilterWrapper(IActionFilter actionFilter)
        {
            this.actionFilter = actionFilter;
        }

        public ProfiledFilterWrapper(IResultFilter resultFilter)
        {
            this.resultFilter = resultFilter;
        }

        public void OnActionExecuted(ActionExecutedContext filterContext)
        {
            using (MiniProfiler.StepStatic("Attribute: " + actionFilter.GetType().Name + ".OnActionExecuted"))
            {
                actionFilter.OnActionExecuted(filterContext);
            }
        }

        public void OnActionExecuting(ActionExecutingContext filterContext)
        {
            using (MiniProfiler.StepStatic("Attribute: " + actionFilter.GetType().Name + ".OnActionExecuting"))
            {
                actionFilter.OnActionExecuting(filterContext);
            }
        }

        public void OnResultExecuted(ResultExecutedContext filterContext)
        {
            using (MiniProfiler.StepStatic("Attribute: " + resultFilter.GetType().Name + ".OnResultExecuted"))
            {
                resultFilter.OnResultExecuted(filterContext);
            }
        }

        public void OnResultExecuting(ResultExecutingContext filterContext)
        {
            using (MiniProfiler.StepStatic("Attribute: " + resultFilter.GetType().Name + ".OnResultExecuting"))
            {
                resultFilter.OnResultExecuting(filterContext);
            }
        }
    }

Now that we have the filter wrapper, we need a way to wire it up. Fortunately, the MVC team made this relatively easy by providing the ability to customize the ControllerActionInvoker. Our next step is to create an ActionInvoker that will wrap all ActionFilterAttributes with our wrapper class.
Note: We certainly don’t want to wrap things more than once, or wrap the ProfilingActionFilter that Mvc-Mini-Profiler may already be injecting. That is what the LINQ operations in this class guard against.

    using System.Collections.Generic;
    using System.Linq;
    using System.Web.Mvc;

    using MvcMiniProfiler.MVCHelpers;

    /// <summary>
    ///	Custom ControllerActionInvoker that wraps attributes for profiling with MvcMiniProfiler
    /// </summary>
    public class ProfiledActionInvoker : ControllerActionInvoker
    {
        protected override FilterInfo GetFilters(ControllerContext controllerContext, ActionDescriptor actionDescriptor)
        {
            FilterInfo baseFilters = base.GetFilters(controllerContext, actionDescriptor);

            WrapFilters(baseFilters.ActionFilters);
            WrapFilters(baseFilters.ResultFilters);

            return baseFilters;
        }

        private void WrapFilters(IList<IActionFilter> filters)
        {
            // Not tested with attribute ordering (may not honor the ordering)
            IList<IActionFilter> originalFilters = filters.ToList();

            // Avoid wrapping the ProfilingActionFilter (sometimes injected by MVC Mini Profiler) and Attributes that are already wrapped.
            IEnumerable<IActionFilter> wrappedFilters = originalFilters
                       .Where(t => t.GetType() != typeof(ProfiledFilterWrapper)
                              && t.GetType() != typeof(ProfilingActionFilter))
                       .Select(item => new ProfiledFilterWrapper(item));

            IEnumerable<IActionFilter> unwrappedFilters = originalFilters
                       .Where(t => t.GetType() == typeof(ProfiledFilterWrapper)
                              || t.GetType() == typeof(ProfilingActionFilter));

            filters.Clear();

            foreach (IActionFilter actionFilter in wrappedFilters)
            {
                filters.Add(actionFilter);
            }

            foreach (IActionFilter actionFilter in unwrappedFilters)
            {
                filters.Add(actionFilter);
            }
        }

        private void WrapFilters(IList<IResultFilter> filters)
        {
            // Not tested with attribute ordering (may not honor the ordering)
            IList<IResultFilter> originalFilters = filters.ToList();

            // Avoid wrapping the ProfilingActionFilter (sometimes injected by MVC Mini Profiler) and Attributes that are already wrapped.
            IEnumerable<IResultFilter> wrappedFilters = originalFilters
                       .Where(t => t.GetType() != typeof(ProfiledFilterWrapper)
                              && t.GetType() != typeof(ProfilingActionFilter))
                       .Select(item => new ProfiledFilterWrapper(item));

            IEnumerable<IResultFilter> unwrappedFilters = originalFilters
                       .Where(t => t.GetType() == typeof(ProfiledFilterWrapper)
                              || t.GetType() == typeof(ProfilingActionFilter));

            filters.Clear();

            foreach (IResultFilter actionFilter in wrappedFilters)
            {
                filters.Add(actionFilter);
            }

            foreach (IResultFilter actionFilter in unwrappedFilters)
            {
                filters.Add(actionFilter);
            }
        }
    }

Now that we have a way to wrap all of the attributes, we need some way to hook our ActionInvoker into the MVC pipeline. To do that, we create a custom ControllerFactory that uses our ActionInvoker instead of the default one. The implementation below wraps the existing controller factory and swaps out the ActionInvoker. The reason I chose to create a wrapper rather than a concrete implementation is that I want to be able to use a controller factory from my IoC provider (Ninject). This pass-through approach gives us the best of both worlds.

    using System;
    using System.Web.Mvc;
    using System.Web.Routing;
    using System.Web.SessionState;

    /// <summary>
    /// A wrapper ControllerFactory which can be used to profile the performance of attributes in MVC.
    /// </summary>
    public class PerformanceControllerFactory : IControllerFactory
    {
        readonly IControllerFactory controllerFactory;

        public PerformanceControllerFactory(IControllerFactory controllerFactory)
        {
            this.controllerFactory = controllerFactory;
        }

        public IController CreateController(RequestContext requestContext, string controllerName)
        {
            var controller = controllerFactory.CreateController(requestContext, controllerName);

            var normalController = controller as Controller;
            if (normalController != null)
            {
                normalController.ActionInvoker = new ProfiledActionInvoker();
            }

            return controller;
        }

        public SessionStateBehavior GetControllerSessionBehavior(RequestContext requestContext, string controllerName)
        {
            return controllerFactory.GetControllerSessionBehavior(requestContext, controllerName);
        }

        public void ReleaseController(IController controller)
        {
            controllerFactory.ReleaseController(controller);
        }
    }

With everything nicely wrapped and ready to go, we just need to add two lines of code to the App_Start procedure of the ASP.NET MVC application to hook in our custom ControllerFactory. I suggest placing this code at the end of App_Start so that any other code you have is able to hook in what it needs first.

IControllerFactory factory = ControllerBuilder.Current.GetControllerFactory();

ControllerBuilder.Current.SetControllerFactory(new PerformanceControllerFactory(factory));

Update: The sample project is now available on Google Code.

In a departure from my normal blog topics, I thought I would dip a toe in writing about web development with ASP.NET MVC.  I have recently integrated the wonderful mvc-mini-profiler tool into JobSeriously and wanted to share how I set up a great dashboard for it.

The typical use case for Mvc-Mini-Profiler is to embed a floating UI on the page, which looks like a hovering chiclet in the corner.  Clicking the Mvc-Mini-Profiler UI shows a popup box with detailed timings.  This works if you only want to profile a few people (e.g. only the site developers) and you want these stats in their face all the time.  However, it does not work well if you want to profile a segment of your real users.  Additionally, this approach usually relies on authentication cookies, which can be stolen (as happened to some folks on the StackOverflow site itself).  It would be bad enough to have a bad guy effectively impersonating you on the site; you certainly wouldn’t want to expose the internals of your code via your profiler as well.

What I wanted to do for JobSeriously was to log these timings to our database and review the results on a dashboard that only admins could see.  Using out-of-the-box functionality of Mvc-Mini-Profiler, I was able to set up logging to the database fairly easily.  Then I sprinkled in some Google Visualization API and came up with a dashboard that looks like this:

My requirements were that the dashboard should help me target the worst offenders on the list without distracting me with outliers.  Therefore, I decided to use box plots to visualize the data.  For those of you who may not be familiar with box plots, the box portion spans the samples between the 25th and 75th percentiles.  The lines (whiskers) at the top and bottom of the box plot show the top and bottom 25% of samples.  This helps you see whether your distribution is skewed towards the top or the bottom (as is the case with JobSeriously).  Using the box plot also allows you to quickly see how much variability there is in page load times.  In short, the boxes represent what the majority of your users are experiencing, while the whiskers show the extremes.  Read more about box plots on Wikipedia.

The first step in setting up the dashboard is to configure your database with the proper tables for Mvc-Mini-Profiler.  Fortunately, the profiler comes with the necessary scripts built in.  However, the create script is embedded in the compiled code with no easy way to get to it.  I recommend that you manually extract the code and create your own create/rollback scripts that suit your environment.  The script is located at the bottom of the SqlServerStorage source code file (extracted SQL code).
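For reference, a create script consistent with the columns the profiler writes might look roughly like the sketch below. This is reconstructed from the column list in the insert statement earlier in this post, not copied from the embedded script, so verify the types (and any extra tables such as the timing tables) against the SqlServerStorage source in your version before running it.

```sql
-- Hypothetical sketch of the MiniProfilers table, reconstructed from the
-- columns the Save() insert statement uses. The embedded script in
-- SqlServerStorage also creates the child timing tables; extract those too.
create table MiniProfilers
(
    Id                                   uniqueidentifier not null primary key,
    Name                                 nvarchar(200)    not null,
    Started                              datetime         not null,
    MachineName                          nvarchar(100)    null,
    [User]                               nvarchar(100)    null,
    Level                                tinyint          null,
    RootTimingId                         uniqueidentifier null,
    DurationMilliseconds                 decimal(7, 1)    not null,
    DurationMillisecondsInSql            decimal(7, 1)    null,
    HasSqlTimings                        bit              not null,
    HasDuplicateSqlTimings               bit              not null,
    HasTrivialTimings                    bit              not null,
    HasAllTrivialTimings                 bit              not null,
    TrivialDurationThresholdMilliseconds decimal(5, 1)    null,
    HasUserViewed                        bit              not null
);
```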

After configuring the SQL Server itself, we need to configure Mvc-Mini-Profiler to log to SQL Server instead of outputting the results to the page.  Luckily, this is a one-line change in Global.asax.cs.

MiniProfiler.Settings.Storage = new SqlServerStorage("your SQL connection string");

I placed the preceding code into my OnApplicationStarted method.
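For context, here is a rough sketch of where that line lives. OnApplicationStarted is the hook Ninject's NinjectHttpApplication provides; the class and connection string are placeholders from my setup, and in a plain (non-Ninject) MVC application the same line goes in Application_Start() instead.

```csharp
// Sketch: wiring MiniProfiler storage at application start (assumes Ninject's
// NinjectHttpApplication base class; adjust names to your own Global.asax.cs).
public class MvcApplication : NinjectHttpApplication
{
    protected override void OnApplicationStarted()
    {
        base.OnApplicationStarted();

        // Log profiler results to SQL Server instead of keeping them in memory.
        MiniProfiler.Settings.Storage =
            new SqlServerStorage("your SQL connection string");
    }
}
```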

Next, we want to add a controller (or a method to an existing controller) that will display the results.  The code is very simple in that it just executes a single SQL statement and passes the data directly to the view we will create next.

    #region Imports

    using System.Collections.Generic;
    using System.Configuration;
    using System.Data;
    using System.Data.Common;
    using System.Data.SqlClient;
    using System.Web.Mvc;

    using Dapper;

    #endregion

    [Authorize(Roles = "Administrator")]
    public class PerformanceController : Controller
    {
        #region Constants and Fields

	// Change this to point to your SQL Server
        private readonly string connectionString = "Sql server connection string";

        #endregion

        #region Public Methods

        public ActionResult Index()
        {
            const string sql =
                @"select SRC.Name as WebRoute,
       count(SRC.Name) as Samples,
       avg(DurationMilliseconds) as AvgD,
       min(DurationMilliseconds) as Low,
       max(DurationMilliseconds) as High,
       max(Ranks.Under10) as LowSample,
       max(Ranks.Over90) as HighSample,
       max(LowRanks.LongestDuration) as BoxLow,
       max(HighRanks.LongestDuration) as BoxHigh
from
(
	select Name,
		DurationMilliseconds,
		Dense_Rank() over (partition by Name order by DurationMilliseconds) as drank
	from MiniProfilers
) AS src
LEFT OUTER JOIN (
	select Name, floor( (max(src2.drank) - min(src2.drank)) * 0.25 ) + 1 as Under10, ceiling( (max(src2.drank) - min(src2.drank)) * 0.75 ) + 1 as Over90
	from
	(
		select Name,
			DurationMilliseconds,
			Dense_Rank() over (partition by Name order by DurationMilliseconds) as drank
		from MiniProfilers
	) AS SRC2
	group by name
) AS Ranks ON Src.Name = Ranks.Name
LEFT OUTER JOIN (
	select Name,
		DurationMilliseconds as LongestDuration,
		Dense_Rank() over (partition by Name order by DurationMilliseconds) as drank
	from MiniProfilers
	group by name, DurationMilliseconds
) AS LowRanks ON Src.Name = LowRanks.Name AND Ranks.Under10 = LowRanks.drank
LEFT OUTER JOIN (
	select Name,
		DurationMilliseconds as LongestDuration,
		Dense_Rank() over (partition by Name order by DurationMilliseconds) as drank
	from MiniProfilers
	group by name, DurationMilliseconds
) AS HighRanks ON Src.Name = HighRanks.Name AND Ranks.Over90 = HighRanks.drank
group by SRC.Name
order by BoxHigh DESC;";
            IEnumerable<dynamic> data;

            using (DbConnection conn = GetOpenConnection())
            {
                data = conn.Query(sql);
            }

            return View(data);
        }

        #endregion

        #region Methods

        /// <summary>
        /// 	Returns a connection to Sql Server.
        /// </summary>
        protected DbConnection GetConnection()
        {
            return new SqlConnection(connectionString);
        }

        /// <summary>
        /// 	Returns a DbConnection already opened for execution.
        /// </summary>
        protected DbConnection GetOpenConnection()
        {
            DbConnection result = GetConnection();
            if (result.State != ConnectionState.Open)
            {
                result.Open();
            }
            return result;
        }

        #endregion
    }

The code above relies on a lightweight ORM called Dapper.  If you are not using Dapper, you will need to change the data = conn.Query(sql); line to use a DataReader to read the rows from the result set.
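A plain ADO.NET version of that read might look like the untested sketch below. It reuses the sql constant and GetOpenConnection() helper from the controller above, and reads each row into a dictionary keyed by the column aliases the SQL defines (WebRoute, Samples, AvgD, and so on).

```csharp
// Sketch: reading the dashboard query without Dapper. Each row becomes a
// dictionary keyed by column alias instead of a dynamic object.
var data = new List<Dictionary<string, object>>();

using (DbConnection conn = GetOpenConnection())
using (DbCommand cmd = conn.CreateCommand())
{
    cmd.CommandText = sql;
    using (DbDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            var row = new Dictionary<string, object>();
            for (int i = 0; i < reader.FieldCount; i++)
            {
                // Key each value by its column alias from the SQL statement.
                row[reader.GetName(i)] = reader.GetValue(i);
            }
            data.Add(row);
        }
    }
}
```

Note that the view below accesses rows as dynamic properties (row.WebRoute), so with this approach you would index by name instead (row["WebRoute"]).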

I know that SQL statement is a huge mess.  Unfortunately, MS SQL Server 2008 lacks some of the basic statistical functions required to make this a little simpler.  If you have any tips for how to make the SQL statement any better I would love to get some feedback.

The last step in creating the Mvc-Mini-Profiler dashboard is to create the view.  You will likely need to change the MasterPage referenced in this example.  The view includes the Google Visualization API libraries and creates the box plot with a table below it.  The table is pageable and sortable.

<%@ Page Title="" Language="C#" MasterPageFile="~/Areas/Admin/Views/Shared/Admin.Master" Inherits="System.Web.Mvc.ViewPage<dynamic>" %>

<asp:Content ID="Content1" ContentPlaceHolderID="TitleContent" runat="server">
    Performance Reporting
</asp:Content>

<asp:Content ID="Content2" ContentPlaceHolderID="MainContent" runat="server">

<div id="visualization" style="width: 100%; height: 450px;"></div>
<div id="dataTable"></div>

<script type="text/javascript" src="https://www.google.com/jsapi"></script>
<script type="text/javascript">
    google.load('visualization', '1', { packages: ['table', 'corechart'] });
</script>
<script type="text/javascript">
    function drawVisualization() {
        // Populate the data table.
        var dataTable = new google.visualization.DataTable();

        dataTable.addColumn('string', 'Route');
        dataTable.addColumn('number', 'Low');
        dataTable.addColumn('number', '25%');
        dataTable.addColumn('number', '75%');
        dataTable.addColumn('number', 'High');
        dataTable.addColumn('number', 'Count');
        dataTable.addColumn('number', 'Average');

        <% foreach (var row in (IEnumerable<dynamic>)Model) { %>
        dataTable.addRow(['<%: row.WebRoute %>', <%: row.Low %>, <%: row.BoxLow %>, <%: row.BoxHigh %>, <%: row.High %>, <%: row.Samples %>, <%: row.AvgD %>]);
        <% } %>

        var dataView = new google.visualization.DataView(dataTable);
        dataView.setColumns([0, 1, 2, 3, 4]);
        dataView.setRows(0, 9);

        var table = new google.visualization.Table(document.getElementById('dataTable'));
        table.draw(dataTable, { page: 'enable', pageSize: 10 });

        // Draw the chart.
        var chart = new google.visualization.CandlestickChart(document.getElementById('visualization'));
        chart.draw(dataView, { legend: 'none', title: 'Highest peaks', vAxis: { title: 'Milliseconds (1000 = 1s)' } });
    };

    google.setOnLoadCallback(drawVisualization);
    </script>
</asp:Content>

Please let me know if you have any questions regarding my code.  If you are looking for a job, I would love to get some feedback on JobSeriously as well.

In my industry, workflow is considered by many to be a holy grail.  Most leaders of litigation support departments feel that they can control risks, issues, schedules, and quality through one comprehensive workflow, or even a checklist.  It is tempting to think that a machine could orchestrate a project to such a degree that the humans involved cannot make errors.

Unfortunately, there is no perfect workflow in our industry.  Not a single litigation support manager I have spoken to feels that they have their hands around the problem.  And so, we look at vendor after vendor peddling software with a state-of-the-art workflow system built in.

So, why do these systems fail to achieve their promised goals?  Here is my list of "gotchas" about attempting to automate your project management functions via workflow:

  • Poor planning – insufficient involvement by everyone touched by the system (IT, front-line employees, and management) during the early evaluation and planning stages can lead to ballooning budgets and quirky systems that force employees into workarounds, introducing new and unknown risks into the process.
  • Poor understanding of the limitations of workflow – a system may handle human-to-human task management, but when you add human-to-system and system-to-system interactions, event management/correlation, performance monitoring, change management, and process rules, many systems just fall apart.
  • Poor change management – sometimes the implementation of the workflow takes several months to complete.  Even with perfect planning at the beginning of an implementation, changes in the business can outpace workflow development.  Many implementers fail to keep up with the pace of business, causing the delivered workflow to be out of date and unsuitable.
  • No integration with LOB applications – users touch many systems during a business process, and many of these interactions are not documented anywhere.  A good workflow system needs to take into account integration with billing systems, case management systems, etc.
  • Automating chaos – if your process is not clearly understood, defined, and optimized, you will be automating chaos via workflow.  The result can only be "automated chaos."
  • Forgetting the people – sometimes it is important to remember that it is not all about nuts and bolts.  It may not be about the lack of change management or the way integration is done with LOB systems.  It’s about PEOPLE.  Involve the users, educate them, get buy-in up front, let them champion the project direction, talk to the front-line employees, and narrow the gap between IT techniques and the business requirements.

Keep these things in mind if you insist on purchasing a system that attempts to automate your project management.  However, I recommend that litigation support managers spend some time optimizing their people and processes before they attempt to automate things.  Sometimes, the right people are all it takes.


Craig Brown of Better Projects wrote an excellent post highlighting the reasons why a client might prefer to work with certain people even though other equally qualified people are available.

"Patients will [...] cross the city to visit their preferred GP [rather than visiting the nearest available doctor]. Patients are not in a good position to assess the quality of medical advice they receive, so what makes them care enough about a doctor to make such efforts? The answer is the "bedside manner" which in the context of this [project management / consulting] is their empathy."

Project managers and consultants need to empathize with their client, their employees, and other stakeholders.  Through empathy, one can truly understand the issues that the other party is dealing with and respond with sincerity.

Once, I was involved in a project where the client services manager had to relay a possible budget overrun to the client.  Unfortunately, the client had been under the impression that the supplier would absorb any cost overruns.  The first course of action the client services manager took was to explain in detail how high the quality of our services had been and to point out relevant passages in the contract to correct the client’s perception.  However, the client soured on the relationship, feeling that he was being cheated.  When the issue was escalated, the executive from our company listened very carefully to the client’s pain points and empathized with him.  This led directly to an improvement in the client relationship and contributed to a successful negotiation and follow-on work.

Do you have any stories where empathy or lack thereof led to a change in the client relationship?  If so, leave a comment.


In my previous article, Good Insights On Managing Knowledge Workers, I concluded that "Empowering an employee is the only way to harness the talent that the company has and is a very challenging feat."  However, this is only the first step in the process.  Since posting that entry, I have been researching the psychology of motivation.

In psychology, the process of learning new behaviors or responses as a result of their consequences is called conditioning.  I believe that the average employee has been conditioned to follow orders, keep quiet, and do the minimum amount of work.  Through their experiences at previous jobs or projects, employees have picked up an attitude that prevents them from accepting empowerment even when given full authority to make their own decisions.

The first job of a project manager in this situation is to condition the employee to respond positively to empowerment.  The PM has to encourage positive behaviors and diminish the negative ones.  There are four commonly accepted methods of reinforcement to do just that:


  • Positive Reinforcement. Something positive provided after a response in order to increase the probability of that response occurring in the future. For example, recognizing that an employee stayed late the night before and saying "thank you."  The most common types of positive reinforcement are praise and rewards, and most of us have experienced this as both the giver and the receiver.
  • Negative Reinforcement. Think of negative reinforcement as taking something negative away in order to increase a response. For example, nagging an employee to fill out their weekly timesheet on time until they start doing it automatically. The elimination of this negative stimulus is reinforcing and will likely increase the chances that the employee fills out their timesheet next week.
  • Punishment. Punishment refers to adding something aversive in order to decrease a behavior. The most common example is disciplining an employee for being late. This works because the employee begins to associate the punishment with the negative behavior. The punishment is disliked, and to avoid it, he or she will stop behaving in that manner.
  • Extinction. When you remove something in order to decrease a behavior, this is called extinction. You are taking something away so that a response is decreased.  An example of this is removing an employee’s internet access to discourage them from playing games during work time.

Being aware of how our actions reinforce behaviors is something that project managers need to keep in mind.  Have you ever let an employee slide with a poor excuse in a status meeting?  If you have, you just used positive reinforcement with an undesired behavior.


Raven at Raven’s Brain has posted a great quote regarding Good Insights On Managing Knowledge Workers that I think applies even more to the litigation support industry.  Litigation support is part of the information and support economy, and most of the people in this industry are knowledge workers.  Knowledge workers are people who add value through their intellect rather than their physical attributes.  Because knowledge workers use intellect rather than brawn, the old technique of managing workers by simply assigning tasks and jobs has become less effective.  Workers are no longer doing one task at one workstation.  They have valuable skills that should be fostered and used in the best possible combination.

"A good manager doesn’t tell people what to do or how to accomplish their tasks, but removes roadblocks and makes the way clear [for employees to be] more productive.  [Another] important role that managers have is to identify and grow talent.  Unfortunately, many managers don’t place enough emphasis on helping the individuals within their teams grow and improve their capabilities.  Ultimately, a manager [ONLY] succeeds when the people who reported to him grow into new capabilities and roles." — Jeffrey Phillips at Think Faster

Managers who do not foster each knowledge worker’s talents, or who hold on to the outdated "workers as resources" mantra, will have a hard time keeping employees motivated in this industry.  Empowering an employee is the only way to harness the talent that the company has and is a very challenging feat.  That is something that I want to explore further in future posts.

How do you empower your employees?

Read the article on the Think Faster blog entitled "What’s a manager to do?"
Read the article on Raven’s Brain blog entitled "Good Insights On Managing Knowledge Workers"

Tags: Litigation Support
