Welcome to RenEvo


  • IOC vs. DI vs. Composition

    First, let's define some basic terms:

    IOC: Inversion of Control
    IOC is, quite simply, handing off responsibility to another system based on a known contract.

    DI: Dependency Injection
    DI is the act of, for lack of a better term, injecting those known contracts at runtime.

    Composition is taking many of those known contracts and acting on them all as one.

    As the years go by, it seems that more and more these terms get mixed together, without any clarification of what each really does. To help illustrate the differences, I am going to use a well-known system that utilizes all three. Hopefully by the end of this little sample, you will see which pattern/practice you really need to be using.

    Asp.Net Membership.

    Anyone who has ever built a website should know about this specific static method call:

        Membership.GetUser();

    This is a perfect example of Inversion of Control. Behind the scenes, you are simply talking to a known contract (MembershipProvider) that implements the abstract method "GetUser". Instead of using the helper method "GetUser", you could talk directly to the default provider to validate a user:

        bool validUser = Membership.Provider.ValidateUser("username", "password");

    This is no different; no matter which MembershipProvider is configured, you know that this will return an expected result.

    This raises the question: how does Membership.Provider get populated with a MembershipProvider? This is where a minor form of dependency injection occurs.

    Deep in the depths of your machine.config, there is a setting that says the default membership provider is named "AspNetSqlMembershipProvider" and is of type "System.Web.Security.SqlMembershipProvider". If you went into your web.config, did a <clear /> in the system.web/membership/providers collection, and then attempted to call Membership.GetUser, it would throw an exception stating that there isn't a default membership provider. In this case, the Membership static class is acting as a DI container for the provider.

    From machine.config (C:\windows\microsoft.net\framework\v4.0.30319\Config\machine.config):

        <membership>
            <providers>
                <add name="AspNetSqlMembershipProvider" type="System.Web.Security.SqlMembershipProvider" />
            </providers>
        </membership>

    Now comes my favorite, Composition.

    The Membership static class provides not just DI and IOC, but also Composition, which is taking multiple implementations of a single contract and treating them as one. To take advantage of the composite implementation of MembershipProvider, you would use this type of code:

        bool validUser = false;
        foreach (MembershipProvider provider in Membership.Providers)
        {
            try
            {
                if (provider.ValidateUser("username", "password"))
                    validUser = true;
            }
            catch (Exception)
            {
                // TODO: Log Here
            }
        }

    Essentially, we are asking ALL of the membership providers if they can validate this user; if any can, awesome, and if none can, the user is not valid.

    Adding a new membership provider through DI for composition/IOC:

        <membership defaultProvider="MyProvider">
          <providers>
            <add name="MyProvider" type="MyNamespace.MyProvider"/>
          </providers>
        </membership>

    With the above configuration, the composite loop would now ask both AspNetSqlMembershipProvider and MyProvider to validate a user, while Membership.ValidateUser would only call MyProvider, the default.

    For this article I just wanted to take a few minutes to briefly touch on the subtle differences between these patterns/practices, as well as clearly separate out what IOC is. I have seen many cases where DI was used for Composition (unfortunately, most frameworks mix the two via Resolve<> and ResolveAll<>) or for IOC.

    If you are just trying to make your code testable, use IOC and provide the required contracts in your constructor, with an optional default constructor that assembles them from working implementations.

        public class RequiredInterface : IRequiredInterface
        { }

        public interface IRequiredInterface
        {
        }

        public class RequiredInterfaceConsumer
        {
            private IRequiredInterface _required;

            public RequiredInterfaceConsumer()
            {
                _required = new RequiredInterface();
            }

            public RequiredInterfaceConsumer(IRequiredInterface required)
            {
                if (required == null)
                    throw new ArgumentNullException("required");

                _required = required;
            }
        }

    This is all that is required to make a class testable with dependencies; the IRequiredInterface can be mocked manually or with any mocking framework.
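    As a quick sketch of the manual route (the fake class and test names are my illustration, not from the original post), a hand-rolled test double is all it takes:

```csharp
// A hand-written stand-in for the real implementation; a mocking
// framework could generate the equivalent at runtime.
public class FakeRequiredInterface : IRequiredInterface
{
}

public class RequiredInterfaceConsumerTests
{
    public void CanConstructWithAFake()
    {
        // The consumer only ever sees the contract, never the concrete type,
        // so the fake slides in through the injection constructor.
        var consumer = new RequiredInterfaceConsumer(new FakeRequiredInterface());
    }
}
```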

    Only use DI where you expect the implementation of the contract to change often enough to justify it. Prefer orchestration over DI whenever possible.

    Only use Composition where it makes sense such as when you need to treat multiple objects like a single object.

    Once again, Composition/IOC do not require DI.

  • Facebook Site Thumbnails Tutorial

    Here is a quick tutorial for all you Facebook users.

    It annoyed me that when I added links from my website, the images that were given were “eh” at best, and usually from some random blog post. I had followed a tutorial that I found on using the image_src meta, but it just didn’t work for me.

    Here is my process:

    First, I opened my site in my browser and "zoomed" out a bit to get more content in the view (I use a widescreen monitor). I then took a decent centered screenshot (OneNote screen clipping is awesome) and tweaked it a bit to my liking in Photoshop so that it was nice and centered. Finally, I exported it as a 200x175 PNG file named "renevo.thumb.png".

    I uploaded it to my web site, then added this tiny bit of HTML to the site header:

    <img src="/images/renevo.thumb.png" alt="" style="display:none;" />

    This doesn't affect my website's layout at all, and since it is set to display:none, "most" browsers will not download the image.

    And here is the result when sharing a link on Facebook:


    So… super easy way to add a 100% available website thumbnail.

  • A bad approach to adding RemoveAt to a Queue<T>

    Tonight on Stackoverflow a gentleman asked if it would be possible to add an extension method to the Queue<T> class to remove an item by index. Anyone who has worked with this class knows that there are no Add or Remove methods, but instead an Enqueue and Dequeue pair of methods. Well, the problem with Dequeue is that it only removes the item at the front of the queue, so that's not really going to work for this guy.

    "I thought of a quick and dirty solution: Inherit from the Queue class and add your own RemoveAt method."

    Hey! Bloody brilliant, right? Wrong. The catch-22 is that there is no way to remove an item from the queue class itself except through the Dequeue method, which simply doesn't cut it. So to get around this, I had to enumerate through all the items, adding each one to a new queue while skipping the index we want to remove. Well, it looks like a neat solution; however, it creates a new queue every time you need to remove an item by index.

    It doesn't sound so bad, but consider the code:

    Custom Queue<T> class
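    The code screenshot didn't survive the move, so here is a sketch of the kind of subclass being described (my reconstruction; the original names may have differed):

```csharp
using System.Collections.Generic;

public class CustomQueue<T> : Queue<T>
{
    // Removes the item at the given index by rebuilding the queue:
    // dequeue everything, re-enqueueing every item except the one
    // at the index being removed.
    public void RemoveAt(int index)
    {
        Queue<T> temp = new Queue<T>();

        int i = 0;
        while (Count > 0)
        {
            T item = Dequeue();
            if (i != index)
                temp.Enqueue(item);
            i++;
        }

        while (temp.Count > 0)
            Enqueue(temp.Dequeue());
    }
}
```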


    Test Code in Main()
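    The test screenshot is gone as well; something along these lines reproduces the benchmark (assuming the custom queue class above was named CustomQueue<T>):

```csharp
using System;
using System.Diagnostics;

class Program
{
    static void Main()
    {
        var queue = new CustomQueue<int>();
        for (int i = 0; i < 5000; i++)
            queue.Enqueue(i);

        // Remove 2500 items from the middle; every call rebuilds the queue.
        var timer = Stopwatch.StartNew();
        for (int i = 0; i < 2500; i++)
            queue.RemoveAt(queue.Count / 2);
        timer.Stop();

        Console.WriteLine("Removed 2500 of 5000 items in {0}", timer.Elapsed);
    }
}
```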



    I knew as soon as I wrote this that it would be a significant performance hit, but I didn't realize just how bad until I looked a bit deeper. Essentially this means that each time RemoveAt is called, the queue is enumerated and a new queue is created. If we have a queue with 5000 items and we need to remove 2500 of them, this results in 12,502,500 calls to the Count property and 12,410,000 calls to the Enqueue method. A lot, much? I didn't get this data out of thin air, though; I used Visual Studio's performance tools, so let's take a look at the actual benchmarks:




    Ouch! Approximately 9 minutes to remove 2500 items from a queue with 5000 items. That is definitely not performance savvy. So I ran a comparison using a generic list.
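    The comparison screenshot is also missing; the equivalent List<T> test would look something like this (List<T>.RemoveAt shifts elements in place rather than rebuilding the whole collection, which is where the speed difference comes from):

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

class ListBenchmark
{
    static void Main()
    {
        var list = new List<int>();
        for (int i = 0; i < 5000; i++)
            list.Add(i);

        // Same workload: remove 2500 items from the middle.
        var timer = Stopwatch.StartNew();
        for (int i = 0; i < 2500; i++)
            list.RemoveAt(list.Count / 2);
        timer.Stop();

        Console.WriteLine("Removed 2500 of 5000 items in {0} ms",
            timer.Elapsed.TotalMilliseconds);
    }
}
```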





    4.36 milliseconds… What a time difference! So you can see that a small, "innocent" work-around can be absolutely detrimental to your application's performance. Oops! Luckily I thought ahead and told the guy someone else would probably post a better solution, and it was as simple as recommending a generic List<T>. :P

  • Integrating Paypal into ASP.Net

    There's no reason to write a whole lot of backend code, workarounds, or third-party PayPal controls just to get PayPal to work with ASP.Net. Actually, it only takes two lines of code in the Page_Load event. Let's take a look.

    Form.Action = "https://www.paypal.com/cgi-bin/webscr";

    Form.Method = "post";

    Using the above two lines of code in your codebehind, you can now just plop the input markup for the Buy Now, Add to Cart, or Donate buttons into your .aspx page. The reason copying and pasting the PayPal markup doesn't work in the first place is that ASP.Net renders everything inside a single form, and you can't have a form inside a form. Of course, using this method the page has to be dedicated to submitting the data to PayPal, but you wouldn't normally have more than one form on a page anyway, so it doesn't really matter.
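    A minimal sketch of how those two lines fit into a page (the page class name is a placeholder of mine; the hidden-input names mentioned in the comment are PayPal's standard button variables):

```csharp
using System;
using System.Web.UI;

public partial class BuyNowPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Point the page's single <form> at PayPal instead of posting back.
        Form.Action = "https://www.paypal.com/cgi-bin/webscr";
        Form.Method = "post";

        // The .aspx itself then just carries the pasted PayPal button markup,
        // e.g. <input type="hidden" name="cmd" value="_xclick" /> along with
        // the business, item_name, and amount fields.
    }
}
```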

    This is the most elegant solution as far as I know. After many hours of Googling around, I never found an easy solution until I realized I could just manipulate the main form.

    Important things to know

    You only need the Page_Load approach if you are using MasterPages; otherwise you can just set the action on the form tag in the individual .aspx page.

  • Upcoming Site Developments

    I have two websites that I am about to embark on. Below is a scratch list of the technology that I will be using, as well as any additional brain-dump info on them. There will be much more information coming later on these, but for now… enjoy my tech.

    Base Framework

        ASP.Net MVC





        HTML Sanitation
            Allowed Tags
                <h1>, <h2>, <h3>
                    width="" (up to 999)
                    height="" (up to 999)


            jQuery to Prettify
                function styleCode() {
                    var hascode = false;
                    $("pre code").parent().each(function() {
                        if (!$(this).hasClass('prettyprint')) {
                            // assumed: mark the block so prettyPrint() picks it up
                            $(this).addClass('prettyprint');
                            hascode = true;
                        }
                    });
                    if (hascode) { prettyPrint(); }
                }


        .Net Open Id


    Spam Protection



    User Customization




            --Probably use the wavatar

        Default User Stuff
            Name (Pre-populated from openid)
            Email (Not displayed, used for gravatar)
            Real Name (Pre-populated from openid)
            Birthday (never displayed, used to show age)
            About Me (Markdown)

  • Reason for the Zune 30gb Bug

    Well, the source code that caused the December 31st Zune brick has been found.

        //------------------------------------------------------------------------------
        //
        // Function: ConvertDays
        //
        // Local helper function that splits total days since Jan 1, ORIGINYEAR into
        // year, month and day
        //
        // Parameters:
        //
        // Returns:
        //      Returns TRUE if successful, otherwise returns FALSE.
        //
        //------------------------------------------------------------------------------
        BOOL ConvertDays(UINT32 days, SYSTEMTIME* lpTime)
        {
            int dayofweek, month, year;
            UINT8 *month_tab;

            // Calculate current day of the week
            dayofweek = GetDayOfWeek(days);

            year = ORIGINYEAR;

            while (days > 365)
            {
                if (IsLeapYear(year))
                {
                    if (days > 366)
                    {
                        days -= 366;
                        year += 1;
                    }
                }
                else
                {
                    days -= 365;
                    year += 1;
                }
            }

            // Determine whether it is a leap year
            month_tab = (UINT8 *)((IsLeapYear(year)) ? monthtable_leap : monthtable);

            for (month = 0; month < 12; month++)
            {
                if (days <= month_tab[month])
                    break;
                days -= month_tab[month];
            }

            month += 1;

            lpTime->wDay = days;
            lpTime->wDayOfWeek = dayofweek;
            lpTime->wMonth = month;
            lpTime->wYear = year;

            return TRUE;
        }

    If you didn't catch it, December 31st was day 366 of a leap year. When days is exactly 366, the "days > 365" check keeps the loop running, but the leap-year branch only subtracts when "days > 366", so nothing changes and the loop never exits; there was simply never any code for what to do when days == 366. Woops…

  • New code snippet plugin for Live Writer

    I am trying out a new code snippet plug-in for Live Writer.

    Public Class CommitDB
        Public Function GetCommitStatusAll() As DataSet
            ' Create Instance of Connection and Command Object
            Dim myConnection As New SqlConnection(ConfigurationManager.AppSettings("NorthstarConnectionString"))
            Dim myCommand As New SqlDataAdapter("GetCommitStatusAll", myConnection)
            ' Mark the Command as a SPROC
            myCommand.SelectCommand.CommandType = CommandType.StoredProcedure
            ' Create and Fill the DataSet
            Dim myDataSet As New DataSet
            myCommand.Fill(myDataSet)
            ' Return the DataSet
            Return myDataSet
        End Function
    End Class

    Let's see how it looks on the blogs!
  • PDC 2008 – For The Poor and Ill Located

    As you all know, or should know, Microsoft's Professional Developer Conference, or PDC, took place recently. For those of us not in a position to attend, Microsoft has been kind enough to post videos of all the sessions along with their accompanying PowerPoint presentations.

    You can visit this Microsoft Blog Post to download and watch.

  • The Tools of the Trade

    So, as a .Net developer, there are a few “primary” tools I use, all for under $100.

    I thought I would spend a quick blog post to describe these tools a bit, how I use them, as well as what they are.


    Notepad

    First and foremost, over my many years of development, Notepad has been my friend. Some people will say "Notepad++" or "Notepad2" or some other derivative of the same application with a bunch of extra functionality. But to be honest, sometimes I prefer the simplicity of Notepad over a lot of other applications for simple scripting, HTML programming, PHP programming, and quick builds of applications (more on that in another post).

    The great features of Notepad are, number one, the speed: the application is super fast with a super small memory footprint. There are restrictions on the size of files, and it is possible to crash the application, but by far it is probably one of the most stable programs on my computer. Another great feature is the line/column position that shows up in the status bar (you do have to enable it from the View menu).


    Microsoft OneNote

    This is literally another "have to have" application. For about $40, you can't beat the basic feature set. I use this program for taking notes during meetings with staff, customers, or just for myself to remember. With the ability to have multiple notebooks, each with multiple categories that contain multiple pages (plus sub-pages), it really helps my organizational pet peeves find a home. Each notebook is stored in a single "always saved" file; it can be shared or run locally, and transported between multiple computers with ease. The "always saved" feature basically means you never have to press save: you just click and type and it's saved. As far as simplicity and ease of use go, this is "the" note-taking software. I am actually writing my book in this application, and when I finish chapters, I transfer them over to Word (formatting is kept). You get simple formatting, everything you would expect from a Microsoft text editor, as well as full customization of the background and templates. Finally, on the note-taking part of the application, you simply press WIN+N to open a new note, or WIN+SHIFT+N to open the last notebook you were editing.

    Another HUGE feature of the application is the screen capture utility: simply press WIN+S and all screens get an opaque overlay that allows you to select any area on the screens. From there you have a few options for how that capture is handled: it can be saved to the clipboard only, placed in a new "note" and kept on the clipboard, or simply "filed" in a new note and kept on the clipboard. This has been THE tool I use for making tutorials; some people rave about other screen capture tools, but for me this is the easiest: one key stroke, and I can quickly choose what I want a screenshot of. By the way, all screenshots in this blog are taken with OneNote.

    The application runs in the tray and has a "light" memory footprint (250k); for what it does, I would rather run this than the alternatives. The tool is so great that it replaced Notepad for my note-taking tasks.

    MSN Messenger

    Some people will definitely argue with this one, but this is a key communication point for me; whether working with people in the office or people working remotely, this application beats a phone call or getting up to walk across the office to speak to someone who might not be there. For most "non-critical" communication I will simply fire off the question to the person I need to talk to, and they answer when they become available or come online. The offline messaging ability of the newer messengers really lets me communicate better with people, especially those in other time zones. Sometimes a question is so small that an email just isn't necessary. I log everything that I say or do in MSN Messenger, which makes it easier to go back over previous communications, and I back up all of my history. I have literally 6 years' worth of MSN Messenger logs that I can search for questions I have asked other people, links people have given me, or simply to remember some specific communications.

    I turn off a lot of options, specifically the window flashing and sound notifications, as when I am programming I don’t like to get interrupted with flashy distractions.

    Microsoft Visual Studio

    This one is kind of a no-brainer. If you are doing .Net development, you are probably going to be using Visual Studio. The great thing about this piece of software is that you can get some stripped-down "Express" editions for free from Microsoft, and by shelling out a few hundred bucks you can pick up the Standard edition with multi-language solution support. I'm not going to go into detail on why I use this software, but it does merit a mention in this post as it is one of the "tools of the trade."

    .Net Reflector

    .Net Reflector, recently purchased by Red Gate Software, is a tool that allows you to disassemble and read reverse-engineered code from .Net assemblies in all supported managed programming languages. If you ever wanted to know how someone did something, simply use this tool to view the source code in your own programming language. There are a few issues where you cannot copy/paste source code directly from the disassembler and compile it, but you shouldn't be stealing source code anyway; this tool is for educational and debugging purposes only. There are MANY plug-ins for the application available open source on CodePlex. I personally use the Code Search add-in. Red Gate promises to keep the software free, but you probably will never find the source code for the application available to the public.

    Bug Tracking Software

    I'm going to cover this as fast as possible, but basically you "must" use a bug tracking database of some sort; Joel Spolsky of Fog Creek Software says so in his 12 Steps to Better Code, and me, I believe him. But no really, using bug tracking software helps you identify, track, and plan features and application bugs. Depending on your bug tracking software, you can get a lot more, or a lot less. If you don't have the extra money lying around for some huge software bundle (such as Test Track or FogBugz), use Notepad, use OneNote, use something, period.

    Source Control Software

    This is another one of those "if you don't have it, you are wrong" pieces of software, also discussed in the 12 Steps to Better Code. A good source control package should contain, at minimum: versioning, rollback support, and "some" sort of integration into either your file system (such as CVS and SVN) or into Visual Studio (Surround SCM, Team Foundation Server). Source control is a subject less talked about but most needed, and in later articles I will go over some of the practices that I have implemented as the source control "master" for my company.

    And there you have it: those are the tools that I use on a regular daily basis. Without these tools, I literally would be lost and unable to function; they are that important to me. There are some other tools that I use as well, such as the application I am typing this blog with, Windows Live Writer. It allows me to easily publish blogs and spend as much time as I want tweaking and writing them, with local draft support and posting to multiple blogs from a single post.

  • Power Testing

    What happens to your company when you lose power for say, 2 minutes?

    Until today, that question was never really asked at our company. Like many others, we assumed full faith in Edison that our power would be stable; well, at least that's the extent of the measures we took to prepare.

    This morning, around 9:00am, our power flickered off then back on, probably due to the massive winds gusting up to 65mph. The experience thus far has literally been a nightmare. First off, the "few" computers that have APC battery backups beeped like mad, which is what they are supposed to do, but when the power came back on, they kept beeping: the batteries, having finally been used, had been destroyed. Most of these backups are 5-10 years old and have been used once or twice at most. This was the start of what I like to call productivity hell.

    1. Phone systems are down. Not a big deal, but the computers that run them did not restart when the power came back on (those old computers with non-state power buttons).
    2. Domain servers are down. One of them is not even on an APC and had one of those nagging "non-state" power buttons; you know the type: a home-made PC that was used as a domain controller, and even though our company has upgraded almost every other PC in the building, these didn't get updated.
    3. DNS is gone. Once again, one of those "computers that stay off" when they lose power. Turns out our DHCP server (which generally goes with the domain controller) took a dump when the system was shut down improperly. Did I mention our domain controllers are still running Server 2000?
    4. Internet down. OK, this is an expected one, but when you have a help desk that uses the internet to access customers for support (via remote connections) and to perform remote upgrades, and then an entire development department that relies on internet and intranet for source control and bug tracking, it hurts. Not to mention the development that depends on 3rd party web services, which cannot be accessed. Not to mention the hosted solutions: about fifteen customers without access to the software we host for them.
    5. Source control down. As mentioned above, this one, while not affected by the nasty non-state power button on the computer, was completely locked out because of some "still unknown" router/switch issue that is preventing access to the source control, even by IP address.

    So, here it is, nearly 7.5 hours after the "tiny" power flux, and development is at a halt because we have a policy against working offline (thanks to junior developers never connecting back up, submitting fixed bugs, and "oops, I did a full get and overwrote my changes"). We have no internet to browse, so we can't even do research while we are unable to program (ever hear that story about those things called books? Most of our developers refuse to read something that might cut them). Our tech support and help desk departments are at a near halt; our phones are working, but the internet is up for a minute or two, then down for five to ten minutes at a time.

    The moral of the story? I would like to see power tests about once a quarter, monthly would be nice, but the scheduling requirements for that have a pretty big effect. What happens during the power test?

    • Power is cycled to the building for 20 seconds in the middle of the night during off-peak hours (three or four am)
    • On restoration of power, document every step to get the systems back and running to 100%.
    • After steps are documented meet with the nerds of the company to figure out how to reduce those steps and/or automate them.
    • Implement changes to reduce down time.

    With this type of preparation you are going to know what to expect, so when it happens during peak hours you have a reduced and documented path to restoring the system to 100%.

    Oh, Solitaire is a great filler for time!


    *Edit: Turns out this was an all day affair.

  • iGoogle Beta Access

    So, have you heard of the iGoogle beta yet?

    Basically, it is a newly updated view for the iGoogle home page that moves the tabs to the left side pane, adds Google chat to the main page, and makes gadgets much more readable. For example, you can now browse each gadget independently to get a full view; for instance, if you navigate to the Gmail gadget, you get full Gmail interaction without visiting the site, and any RSS gadgets will be expanded to allow you to read the posts within the expanded gadget.

    Back to the point: the iGoogle beta is currently a "developer"-only beta. The good news is, you can try it by simply visiting the Google sandbox, answering a few questions (like your name), and having some fun.

    Turn on with this link

    Turn off with this link

    And that’s about it, personally I have been using the iGoogle home page almost since it was announced as beta, and without my little “clean” dashboard, I would go out of my mind.  Below is a screenshot of my iGoogle page, what’s yours look like?


    Bookmarks (link list)
    Ctrl+Alt+Del comic RSS Feed
    Coding Horror RSS Feed

    Want a RenEvo iGoogle gadget?

  • C# Background Compiling!!!!

    With Visual Studio 2008 SP1 (now in beta), C# gets one of those huge features whose absence turned me off to the language in the first place: background compiling!

    What does this mean???

    Simply put, when you type an error into the code, you know about it without having to compile. For example:

    Product is an invalid object, so it gets a red squiggly without compiling, which is a huge step.

    Another example:

    Wrong return types, which is nice.

    This feature alone will probably get me into C# a lot faster than I was originally planning; this is what dynamic spell checking did for Microsoft Word, in my opinion.

    Great feature add, and I can't wait for the "official" release of .Net 3.5 SP 1 as well as Visual Studio 2008 SP 1.

  • Web 2.0 Text - The Easy Way

      This tutorial will quickly outline the way to make a "Web 2.0" styled header image with minimal effort.

      What you need:

      • Photoshop CS3 (CS2 may work, not sure, been a while since I have used it)

      Open up Photoshop, and create a new document with the following settings:


      Next, add a black background just so you can get your bearings, and set the transparency to about 50%; this will keep the grid in the background but tame it down quite a bit, making the top layers more visible.

      Now, using the text tool, click anywhere and type in "Web 2.0 Logo". I use white in this tutorial since I can create an overlay later to change the color. For your font, set it to Segoe UI (or another favorite font) at a size of 48. Now resize the text area so that your text fits snugly inside of it; this allows you to better position it later.

      Your logo should look something like this now:


      Now comes the fun part: let's duplicate the text layer, name it "Reflection", select the Move tool, click on "Show Transform Controls", and simply drag the top of the middle gizmo down until you get a symmetry that you like; see the bottom image for how I chose to lay it out. This sets up your "flat area", so the angle you use will force the eye to see the reflection at that angle. The longer the text is on the "reflection" side, the steeper the slant of the reflected surface will appear to your eyes.


      Next we are going to add a layer mask to our reflection layer, click on the Square with the Circle in it on the Layers toolbox as highlighted below:


      By default, the layer will have 100% opacity (i.e. solid white).

      Click the layer mask on the Reflection layer to select it, choose the Gradient tool, and press "D" on your keyboard to reset your colors. We want to select the gradient that is white on the left and black on the right.


      Now, using the gradient tool on the layer mask (you selected it, right?), put your mouse cursor just below the loop in the upside-down g, and drag the mouse downward while holding the left mouse button and the SHIFT key (this makes it a straight line); you want to drag until about 3/4 of the way through the upside-down text. Once you release the mouse, it should look something like this:


      Now, let's go over what type of surface we want to "reflect" on. Generally, most reflective surfaces are not perfect copies; right now, ours is. There are some small adjustments we can make, as well as some blending properties we can set, that will give it a better look without killing the actual text in the reflection, making it easier to update and change the colors as we please.

      So that we can get a better look at our reflection, let's change the background layer's opacity back to 100%.

      Not the most impressive logo yet.

      Move the reflection layer below the logo layer, go to the blending options for the Reflection layer, and add an outer glow with the following settings:


      This will now give us a Gaussian blur look without actually rasterizing our reflection layer.

      And finally, change the opacity of the reflection layer to 50%, this will tone it down and blend it into the background a bit better. Now your image should look a bit like this:


      That looks much better, but our logo is still way too high. Let's move the reflection layer down a bit, so that the logo doesn't look like it is sitting right on the surface. A good rule of thumb is to not cross the lowercase letters too far, or it will look like the letters are going "through" the bottom. After adjusting the height of the reflection, link the two layers together so that when you move one, you aren't separating them.

      So there we have it: a quick-to-update Web 2.0 graphic that didn't take a lot of fancy tricks.

      With a little work on the background, plus some foreground text and a quick logo, the graphic takes on a nice appearance.



      Download the PSD

  • The daily mental thought - Yes or No?

    So I spend a good portion of my day bouncing back and forth between C# and VB.Net.  I have been a VB programmer since VB5, and even earlier writing VBA, VBScript, and ASP.  Lately, though, it seems that no matter how hard I try, filling the software department at work is a challenge.  There are just so few .Net programmers who want to work in VB.

    With a new framework being built, as well as a web framework, the question keeps popping into my head: do I just start recruiting C# developers and switch the house language over? Do I run a mixed house of C# and VB? Or do I stick with the die-hard position that VB.Net is an easier language to read and understand, and spare myself the massive amount of right-pinkie work I would need for C#?

    If you want proof of this issue, head to your local book store. You will find a huge array of "Start programming now in VB.Net!", "Write a VB.Net program", and even the faithful "VB.Net for Dummies", while across the shelf you will find more of "Pro C# Development in .Net 3.5", "Expert Development in C#", and so on. The books that stores carry show you what people are buying, as well as a good hint at what people are writing.  There are just more people working with C#, and more adopting it over VB.

    I am one of those who argues that C#, J#, VB.Net, Managed C++, etc. all compile to the same IL code, but when you look at the resources available to you, C# just proves to be better supported, not easier or more powerful. (Although unsafe blocks technically do make C# more powerful; then again, VB.Net has background compiling, which makes it more powerful at design time. The list of pros and cons really does go on and on.)

    Anyway, I picked up two new books from Apress last night for ASP.Net 3.5 in C# and C# 2008 & .Net 3.5.  Ironically, they have sections devoted to VB.Net, so I guess I am not the only one out there who is seeing this pattern as well.

    Oh, and another quick thought: who decided that no one writes sockets and remoting in VB.Net? The VB.Net books barely mention them, other than a quick "this is a socket, and this is how to send 'hi' back and forth". Yet in the five C# books I own, they are covered extensively, including a lot about how the marshaling actually happens, the ups and downs, and the different situations in which to use them.

  • The importance of Virtual PC

    Up until about 2 months ago I had not realized what I was missing out on.

    Microsoft Virtual PC is not only extremely useful, it is free. Granted, you have to pay for the operating systems you install, but that is still one hell of a deal when you compare it to the cost of a test machine, the power to run it, and the KVM or storage space for it.

    So, what is Virtual PC?  Virtual PC is a computer emulator that you can run on your desktop. It boots like a real computer, uses virtual hardware, and lets you install operating systems on it.

    Why is this so important, you might ask?  Well, let's think about the phrase "It works on my machine...".  This has been a software mantra for quite a few years, and admittedly I used to overuse it myself.

    During the software development process, you don't find a lot of developers who will take the time to test on multiple operating systems, clean machines, or the other configurations that are common for your target audience.  Instead, you fix the bug, run it from the IDE, and if it doesn't crash again on the replication steps (if you had any), you sign it off as good and either deliver a patch to the customer or send it to QA. Let us hope you are doing the latter.

    Now, let's investigate this situation.  You fix the bug on your Windows Vista box and send it to your QA guy, who also happens to be using Windows Vista. He signs it off as fixed, and it is deployed to the customer. Except the customer is using Windows XP, and the bug still exists on Windows XP because, for some reason, the user folders are in totally different locations.  You never tested in that environment, so you have no idea what is going to happen; once again the customer lets you know that it is broken, and it comes back to you.

    How do you solve this customer loop-back?  Very easily, actually: you simply use Virtual PC to test on a few different environments before you send the fix to QA or the customer.

    Below is the list of Virtual PCs that I have set up and use daily.

    • Blank install of Windows XP SP2 - Always up to date.
    • Windows XP SP2 with the Point Of Sale we integrate into.
    • Windows XP SP2 with Office 2003 - Always up to date.
    • Blank install of Windows Server 2003 R2 w/IIS & SQL Server 2000
    • Blank install of Windows Server 2003 R2 w/IIS & SQL Server 2005
    • Blank install of Windows Server 2003 w/IIS (non-R2)
    • Blank install of Windows Vista Business - (for 32bit testing, as I run Vista 64 bit)

    As you can see, the list above targets my entire client base.  We restrict our software to being installed only on XP SP2, Server 2003, Server 2003 R2, and Vista. At any time I can load up one of these PCs, do a quick test of the issue being reported on the operating system configuration the customer is complaining about, find the issue much faster, and make sure it doesn't happen on ANY of the configurations again.

    I suggest taking the time, as well as the hard drive space (each OS is about 5 gigs), to get this set up. Once you have your targets, do a monthly burn to DVD to keep a clean backup with the latest updates, so you can always fall back to a non-damaged environment.

    Note: Game development, or other graphics-heavy development, will most likely require real computers for the graphics processing speed; these virtual machines are better suited to business development needs.

    Good luck!

  • Creating Automated Builds

    One of the stresses of working in software development is getting the applications built on a regular basis.  Generally, builds are done on a senior developer's workstation, then sent off to testing attached to the bug tracker, via email, or as a link to a shared resource.

    Getting automated builds up and running is a tricky thing, and with some complex projects it requires a lot of scripting, knowledge of the internal software, and working through some licensing issues.

    When the VP of Software Development requested "Automated Nightly Builds" in a recent meeting, I looked at our 10+ software products and just shuddered. Below is a list of the software that we use.

    • Microsoft Visual Studio 2003, 2005, and Visual Basic 6 for code
    • Seapine Surround SCM for source control management
    • Seapine Test Track Pro for bug tracking
    • InstallShield 12 for some of our software installation media
    • WinRar for patch packaging
    • MSBuild from the .Net 2.0 Framework for ClickOnce installation generation and satellite resource file compilation

    As you can see, building integration between all of those products and kicking off event-driven automated builds is going to be tricky.  Although I have recently downloaded Cruise Control .Net and implemented it for remote builds of one of our products, putting the rest together is going to take some work.
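    To give a feel for what the Cruise Control .Net side looks like, here is a minimal sketch of a ccnet.config project entry. The project name, paths, and trigger interval below are all hypothetical stand-ins, not our real setup:

    ```xml
    <cruisecontrol>
      <project name="OneSource-Testing">
        <!-- hypothetical working directory for the pulled source -->
        <workingDirectory>C:\builds\OneSource</workingDirectory>
        <!-- poll for changes once a minute (interval is arbitrary here) -->
        <triggers>
          <intervalTrigger seconds="60" />
        </triggers>
        <tasks>
          <!-- the actual compile: MSBuild against the solution file -->
          <msbuild>
            <executable>C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\MSBuild.exe</executable>
            <projectFile>OneSource.sln</projectFile>
            <buildArgs>/p:Configuration=Release</buildArgs>
          </msbuild>
        </tasks>
        <publishers>
          <!-- writes the redirected build output to XML logs in the working directory -->
          <xmllogger />
        </publishers>
      </project>
    </cruisecontrol>
    ```

    Each product would get its own project element; the per-product source pull and installer steps would slot in as additional tasks.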

    Let me explain the game plan that I will be attempting, and I will blog my progress with the pitfalls of the system.

    Pulling from source control

    The hardest part of this is going to be getting the code out of source control. Granted, it is just a quick pull of the files into a directory, but managing which branch to work out of is the trickiest part.  Surround SCM allows multiple branches of source code, and we have a workflow that lets our junior developers work on sub-branches while I, as the only senior developer at the moment, work in the main branch of the code.  On a daily basis, or as needed, I review the code, promote it from the junior-level sub-branch, retrieve it, and check for a stable build.  This is obviously a process that cannot be automated.  So first, I am going to have to create two builds: one for the in-testing code, which comes from the sub-branch, and a second for release code, which is pulled from the main branch.

    Compiling the code

    With the third-party tools that we are currently using, some unique licensing has to be in place in order to compile the applications.  Although using the command-line compilers instead of the IDE might be able to just embed current licensing, I am pretty sure that the licensing scheme of some of our assemblies will require additional licenses.  A good example of this is the activation-based Red Gate tools that we are using.

    Building the applications

    We have three different scenarios with the products that we produce. The first, and easiest, is a completely self-contained application: all of the source code is stand-alone and requires no shared projects.  The second scenario is web sites; these are going to require that we pull only the releasable files, such as ascx, aspx, asax, and binaries.  And the third is the most difficult: our framework-based applications.  We have a shared framework based on the Composite UI Application Block from Microsoft, so that we can accommodate modular application design.  When we want to add a new product, we simply create a new set of modules that load into the application framework.  Then we can distribute them and only turn them on if the client has licensing for them.  This is also the direction we are trying to move all of our applications: our "OneSource" for applications, which also happens to be the name of our framework.

    Creating the ClickOnce Installs

    With our newest framework applications, we are using the ClickOnce technology from Microsoft that is built into .Net 2.0.  This is an outstanding way to get updates out to our customers without calling or emailing them to update their software. The biggest problem with ClickOnce installations, though, is that they are statically bound to an install location: if you configure the install for http://www.domain.com, then you have to deploy it FROM that location.  This extra step is going to require that we set up multiple staging environments for the ClickOnce installations.  The first staging area will be our internal, developer-only install, used primarily for review of the latest software by the VP, analysts, and technical writers.  Second, we will want a Beta, or pre-release, stage; this will need to be publicly accessible and could be given to customers, the CEO, project managers, and the help desk for training on new features, as well as for approval and additional QA.  The final stage that will need to be built is Release: the released version of the software for everyone to install, made available via a public domain.
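    As a rough sketch of what the three stages imply (the solution name and URLs below are made up for illustration), each staging area needs its own publish pass, because the install URL is baked in at publish time. MSBuild exposes this through the Publish target and the InstallUrl property:

    ```
    rem hypothetical: one ClickOnce publish per staging area
    msbuild OneSource.sln /t:Publish /p:Configuration=Release /p:InstallUrl=http://installs.internal/dev/
    msbuild OneSource.sln /t:Publish /p:Configuration=Release /p:InstallUrl=http://www.domain.com/beta/
    msbuild OneSource.sln /t:Publish /p:Configuration=Release /p:InstallUrl=http://www.domain.com/release/
    ```

    The output of each pass then has to be copied to the matching location, since clients will check that exact URL for updates.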

    Building the Install Media

    This one is by far the trickiest.  Because files could be added to or removed from the installations, knowing what changed, and adding it to the installer in the correct place before compiling, is going to prove extremely tricky.  To be honest, this is the area I know the least about, and I will be doing a lot of research on it when I get there.

    Building the Patch Media

    This one is pretty easy: basically, we take the build directories for the software and add them to an SFX via a huge command line to WinRar.exe, and it creates a nice little patch for us.  I could probably take it one step further by generating a CRC32 compare list and adding only changed files, but this is a nice way to "magic zip" previous installations that might not be completely up to date.
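    The CRC32 compare idea is simple enough to sketch with standard tools. Here is a minimal illustration using cksum (the directories and files below are stand-ins for the real build output, not our actual layout):

    ```shell
    # stand-in "previous build" and "new build" directories
    mkdir -p build_old build_new
    printf 'alpha\n' > build_old/a.txt
    cp build_old/a.txt build_new/a.txt          # unchanged file
    printf 'beta\n'  > build_old/b.txt
    printf 'beta2\n' > build_new/b.txt          # changed file
    # one "CRC size name" line per file, sorted so the lists are comparable
    ( cd build_old && find . -type f -exec cksum {} \; | sort ) > old.manifest
    ( cd build_new && find . -type f -exec cksum {} \; | sort ) > new.manifest
    # lines unique to the new manifest are files whose CRC changed (or are new)
    comm -13 old.manifest new.manifest | awk '{print $3}' > changed.list
    cat changed.list
    ```

    Only the files listed in changed.list would then be fed to the WinRar command line, keeping the patch small.
    
    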


    Of course, this entire process needs to be logged, and the right people notified if something goes wrong. Luckily, this is where Cruise Control .Net comes into play: it logs the entire process (by redirecting standard output when it calls the executables) and saves the logs to XML in the working directories of the builds. The VP and I also run the Cruise Control client application in the tray; it displays the status of each build, the last build time, and any additional messages.  When a build finishes, we get a little popup in the bottom right-hand corner of the screen letting us know whether it was successful.  Obviously this is fine for working-hours builds, but I would prefer more information, so I will be setting up SMS alerting and email logs at the end of the build.


    Basically, that's it... not much work, if you like this sort of thing. Unfortunately, I still have to maintain the constant stream of new features, updates, and bug fixes to the current software.

    I will keep you up to date.

  • "Not my code..."

    By definition in agile software development, no one owns the code.

    What does this mean, exactly?  Simply stated: if you find a bug, you fix it; if you can refactor a process without breaking it, you refactor; if you find something that could use some updates, you update it.  There is no architecture meeting, no code-change approval process, and definitely no "I found a bug in your code".

    I was faced with a situation where a junior developer sent me an email during regular business hours (mind you, I sit about 10 ft away from him) letting me know that there was an error in the code.


    From the code above, any junior-level developer could easily notice that @MenuId was used instead of @ButtonPropertiesId.  Instead of just changing the string name, I received an email at 5pm stating that the code was broken, and that this might be the spot.  Why not just fix it?

    Let me restate a few principles from the Agile Manifesto, which our team tries to live by.

    The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.

    This basically means: talk to each other. Instead of sending an email to someone 10 ft away from you, get up, stretch your legs, rest your eyes, and talk to the other developers.

    Working software is the primary measure of progress.

    So if the code works, you are progressing; if it does not work, progress does not continue until it does.

    Continuous attention to technical excellence and good design enhances agility.

    If you spend some time doing it right, and "just fixing it", progress is going to increase, which reinforces the previous principle.

    In short: if you find something that is broken, you see how to fix it, and you have the knowledge to fix it, why not just fix it?

    Now, I do realize that there are environments where developers are confined to tiny areas of the code and cannot dream of touching other portions of the codebase. But when you are encouraged to, and the team you work with pushes the fact that agile methodologies work, then double-click on that variable, paste in the right one, check it in, and call Tom out on it, because you just fixed a senior developer's mistake as a junior.

  • Really now, it won't run on Vista 64bit?

    Well, I decided, what the hey, right?

    I downloaded Windows Live Writer Beta 3, installed it on a Virtual PC (Windows XP SP2 32-bit), then simply copied the C:\Program Files\Windows Live\Writer folder that it created over to my C: drive, opened it up, and, as I fully expected, it works fine.  This operating-system blocking by installers of .Net applications is just comical, if you ask me.  The entire point of the .Net framework was to create an operating-system-agnostic development environment, so developers wouldn't have to worry about 32-bit vs. 64-bit especially.  That is the wonder of compiling for "AnyCpu".


    Oh, what's that? That is this blog post running on Vista 64 in Windows Live Writer Beta 3.  Silly installers...

    On some other notes, I downloaded and played around with the Crysis single-player demo that released on Friday. To my dismay, my nVidia Quadro FX 3500 is incapable of running Crysis; I get all kinds of funky issues, such as objects not rendering (invisible) and only being able to see shaders, terrain, and shadows.  Let me tell you, the opening sequence inside the cargo bay of the plane is quite weird-looking with floating eyes. I tried installing an 8500GT; the bonus was that it has 512 MB of memory and DX10 support, the bad part was that it dropped my Vista score from 5.9 to 4.7, and every other game I play ran so badly that I had to turn everything down.  So, needless to say, I returned the 8500GT and put my Quadro back in.  I don't want to sacrifice a $1000 high-end card to play a $50 game, no matter how cool it is.  Hopefully Crytek hears my cries and will help me get it fixed; otherwise my helping on my own mods is in jeopardy, as I can't even run the game engine we are developing on.

    And finally, yes, I was one of those geeks waiting out in front of GameStop on Saturday night to pick up Guitar Hero 3 for PS3.  I haven't had much time to play it yet, except the hour after I got home from picking it up. So far, my opinion is that it is much better and more responsive than Guitar Hero 2, and even the note sequencing is a lot better for those of us who play guitar; moving down a scale while the music runs up a scale just didn't make sense in our heads (which is what kept me on Hard instead of Expert). I can finally sell off my two guitars, Guitar Hero 2, and that '80s tribute Guitar Hero (the name slips my mind) on Craigslist, or maybe eBay.

    I decided to add the .zip of the Writer for ease of download.
    Download Writer Beta 3

  • Windows "Live"

    Originally, some time last year when Windows Live started getting all of this attention, I was drawn in, being a Microsoft junkie and all.  Windows Live Messenger, OneCare, Live Writer, Hotmail Live; I was giddy over them all.

    Then came the dreaded "You can not install this program on that operating system".

    Let's back up a bit... As soon as Vista went RTM, I installed it; see above about how much of a Microsoft junkie I am. Having dual dual-core Opterons, installing Vista 64-bit seemed the obvious choice, so that I could take advantage of the extra processing capabilities of the 64-bit CPUs.  The first message I get when installing all of my old software: "Windows OneCare cannot run on this operating system."  Are you kidding me?  A Microsoft product not working on a brand-new operating system, specifically a brand-new anti-virus system built for Windows? So I resorted to installing Avast, which seemed at the time to be the only 64-bit-capable Vista anti-virus.

    Now, let's move forward to today... Once again, I got an itch to try out Windows Live Writer. I must admit that I had been a bit lazy about posting to this blog, so I thought I would get back on track.  I head over to Live Spaces to pick up the newest beta of Windows Live Writer, only to get the same message I got with OneCare: "You cannot install this program on this operating system".

    OK, let's look at this logically, and with a bit of thought.

    Windows Live Writer is a .Net 2.0 application, most likely written in C#. I am a .Net 2.0 developer who works in Vista Ultimate 64. Why the hell can't this "special" .Net 2.0 program run on Vista 64?  The .Net framework manages the mode (be it 32- or 64-bit) when the program is JIT (Just-In-Time) compiled from the IL (Intermediate Language).  So what the hell?  Even then, if they noticed some problems with the JIT, why couldn't they have just specified x86 mode instead of the "AnyCpu" configuration at compile time?  I am currently developing an x86-compiled application right now that I have zero problems running, debugging, and building on x64.
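    To sketch what I mean (the source file name here is hypothetical), picking the target mode is a single switch on the C# compiler; the installer has no business caring:

    ```
    rem compile for any CPU: the JIT decides 32- vs 64-bit at load time (the default)
    csc /platform:anycpu Writer.cs

    rem force 32-bit mode everywhere, even on x64, if the JIT misbehaves
    csc /platform:x86 Writer.cs
    ```

    An x86-flagged assembly runs under WOW64 on a 64-bit OS, so either choice would have worked on Vista 64.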

    I am getting a bit "iffy" about Microsoft's whole direction with "Live". What's next, the newest patch to Windows Live Messenger won't run on x64?  Now that would be a killer, considering I use Live Messenger for communicating with co-workers on and off site.  When will Microsoft understand that x64 is a mainstream platform, and that they need to get their apps up to their own "Microsoft Certified Application" specifications?

    Application must run in both 32 and 64 bit modes

    Anyway, enough about Live. Let's see how my registration for "Live Custom Domains" goes; I still have no idea what the hell it is, but it has an SDK, and I had a free domain name to play with.


  • Adding pages to Community Server 2007


    Well, here I am again, and I have finally taken a bit of time to set up a new CS 2007 website. Even though the steps are pretty similar to before, there are some changes in the new Community Server.  Below, I outline the steps required to add new pages, with skinning support, to your Community Server 2007 website.
    In the previous article on adding pages to CS 2.1, I had you copy files and cut out content; for this article we will create everything from scratch.

    Step 1
    Start off by creating a new directory called "Sample" in your root web directory (referred to as web/sample/ from here on out).

    Now create a new text file called "Default.aspx", open this file with Notepad, and add the following lines.

    <%@ Page %>
    This is my placeholder, my actual file is located in web/themes/[THEME NAME]/sample/sample.aspx

    You should now be able to navigate to this page without any issues. (http://www.yourdomain.com/sample/)

    Step 2 - Creating the themed page
    For this article, I will only cover how to do this on the default theme; you can apply pretty much the same steps to all other themes.

    In your web/Themes/ directory, create a new directory called "sample".

    Next, create a new text file named "sample.master", open it with Notepad, and add the following lines.

    <%@ Master Language="C#" AutoEventWireup="true" MasterPageFile="../Common/master.Master" %>

    <asp:Content ContentPlaceHolderID="HeaderRegion" runat="server">
        <CSControl:SelectedNavigation Selected="sample" runat="Server" />
        <CSControl:ThemeStyle runat="server" href="~/style/forum.css" />
        <CSControl:ThemeStyle runat="server" href="~/style/gallery.css" />
        <CSControl:ThemeStyle runat="server" href="~/style/forum_print.css" media="print" />
        <CSControl:ThemeStyle runat="server" href="~/style/gallery_print.css" media="print" />
    </asp:Content>

    <asp:Content ContentPlaceHolderID="bcr" runat="server">
        <asp:ContentPlaceHolder ID="bcr" runat="server" />
    </asp:Content>

    <asp:Content ContentPlaceHolderID="rcr" runat="server">
        <asp:ContentPlaceHolder ID="rcr" runat="server" />
    </asp:Content>

    Just to explain a bit: we have created a new master page that inherits the site's master.Master, we have overridden the header region to select the "sample" page, and finally we have created two content regions for the page (CS 2007 style).

    At this point, you could add any custom code to your new page's "main" template, but for now, let's just leave it as is.

    Now, let's create another text file, call it "sample.aspx", and open it in Notepad.

    The first thing we want to do in this file is add all of the standard asp.net page directives, as well as import some common namespaces that are useful for coding later on.  This also keeps our "ad support".

    <%@ Page EnableViewState="false" Language="C#" AutoEventWireup="true" Inherits="CommunityServer.Controls.CSThemePage" MasterPageFile="sample.Master" %>
    <%@ Import Namespace="CommunityServer.Components" %>
    <%@ Import Namespace="System.Collections.Generic" %>
    <%@ Register TagPrefix="CSUserControl" TagName="AdTop" src="../Common/Ad-Top.ascx" %>
    <%@ Register TagPrefix="CSUserControl" TagName="AdBottom" src="../Common/Ad-Bottom.ascx" %>

    Next, we will set the page's title; this is something I don't remember seeing in CS 2.1, and it's quite useful for creating "sub" sites within your main site.  For now, we will set the title to the default site name.

    <script language="C#" runat="server">

        void Page_Load()
        {
            SetTitle(CurrentCSContext.SiteSettings.SiteName, false);
        }

    </script>

    Now we will move on to adding our bcr, or Body Content Area.  This is much cleaner than the previous way of adding content parts, as you no longer have to be an HTML guru; you can simply enter regular HTML.

    <asp:Content ID="Content1" ContentPlaceHolderID="bcr" runat="server">
        <div class="CommonContentArea">
            <CSControl:AdPart runat="Server" ContentName="StandardTop" ContentCssClass="CommonContentPartBorderOff" ContentHoverCssClass="CommonContentPartBorderOn">
                <CSUserControl:AdTop runat="server" />
            </CSControl:AdPart>
            <div class="CommonContent">
                <CSControl:ContentPart ContentName="sample" runat="server" ContentCssClass="CommonContentPartBorderOff" ContentHoverCssClass="CommonContentPartBorderOn">
                    <h2 class="CommonTitle">Sample Page</h2>
                    <div class="CommonContent">
                        <div style="line-height: 140%;">
                            You have created a sample page!
                        </div>
                    </div>
                </CSControl:ContentPart>
            </div>
            <CSControl:AdPart runat="Server" ContentName="StandardBottom" ContentCssClass="CommonContentPartBorderOff" ContentHoverCssClass="CommonContentPartBorderOn">
                <CSUserControl:AdBottom runat="server" />
            </CSControl:AdPart>
        </div>
    </asp:Content>

    Finally, let's add our rcr, or Right Content Area.

    <asp:Content ContentPlaceHolderID="rcr" runat="server">
        <div class="CommonSidebar">
            <CSControl:ContentPart ContentName="sampleSidebar1" runat="server" ContentCssClass="CommonContentPartBorderOff" ContentHoverCssClass="CommonContentPartBorderOn">
                <div class="CommonSidebarArea">
                    <div class="CommonSidebarRoundTop"><div class="r1"></div><div class="r2"></div><div class="r3"></div><div class="r4"></div></div>
                    <div class="CommonSidebarInnerArea">
                        <h4 class="CommonSidebarHeader">Sidebar 1</h4>
                        <div class="CommonSidebarContent">
                            Sign-in with your Admin account and double-click to edit me!
                        </div>
                    </div>
                    <div class="CommonSidebarRoundBottom"><div class="r1"></div><div class="r2"></div><div class="r3"></div><div class="r4"></div></div>
                </div>
            </CSControl:ContentPart>
        </div>
    </asp:Content>
    Now save the file, and go ahead and close it.

    Adding the page to the SiteUrls
    Unlike CS 2.1, you must complete this next step for a page to even load in the web application.

    Open up the "SiteUrls.config" file with notepad.

    Scroll down to around line 562. You will see a comment that says "When adding custom locations, add above here."  We are going to do just that.

    Add the following just above the comment.

      <location name="sample" path="/sample/" themeDir="sample">
        <url name="samplehome" path="" pattern="default.aspx" vanity="{2}" physicalPath="##themeDir##" page="sample.aspx" />
      </location>

    Now, before you go jumping off to your newly created page, we have a few more things to do, or else you are going to get some errors.

    Adding the link to the top menu
    With the SiteUrls.config file still open, scroll down to the bottom, to the "navigation" section.  Just under the "home" link, add the following.

    <link name="sample" resourceUrl="samplehome" resourceName="sample" roles="Everyone" />

    Save the file, and go ahead and close the SiteUrls.config.

    As you may have noticed in my previous articles, until you add the resource to Resources.xml, your link will not show up on the main link menu.

    Navigate to "web/languages/[YOUR LANGUAGE]/" and open up the file "Resources.xml" with notepad.

    Around line 44, you will see the text "<!-- Main Navigation -->"  just below the "Home" resource, add the following code.

    <resource name="sample">Sample</resource>

    Save the file.

    Restarting the web application
    At this point we need to just "touch" the web.config file; this is done by opening the file with Notepad, saving, and closing it.
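    If you have a command prompt handy, the same effect (updating the file's timestamp so ASP.NET recycles the application) can be had without Notepad. On a Unix-like shell it is literally the touch command; the web.config below is a stand-in created just for the demonstration:

    ```shell
    # create a stand-in web.config for demonstration purposes
    echo '<configuration />' > web.config
    # update its modification timestamp; ASP.NET watches this file and
    # recycles the application when it changes
    touch web.config
    ls -l web.config
    ```

    On Windows, re-saving the file in Notepad does the same thing.
    
    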

    Navigate to your newly created page
    Finally, you should be able to go to your home page, click on the Sample link, and see your new page!

    If you don't see it, start back at the beginning and find out what went wrong, or post in my forums and I will try to help you out.

    This concludes the article. All in all, I think that adding and modifying pages in CS 2007 is much easier, and allows much more flexibility, than the previous version.  So go out there, be creative, and see what you can do!

    Download Links

    You can download all of the modifications here: CommunityServer 2007 Sample Page

    You can see this article in action at: http://www.swglabs.com


  • So the journey went...

    As some of you may have noticed, I went a bit AWOL this weekend.  I had grand plans of working on some items in C&C FarCry, as well as getting a new rev out for SWG Labs.  Unfortunately, none of this happened.

    Friday night, as usual, the wife and I went on our date night. We had planned to watch Night at the Museum in IMAX, but the theater was WAY overpacked, so we just got some dinner at the Fox Grill.  This is where the journey began...

    My wife, being the fun type of girl she is, decided that we must find Guitar Hero 2 and two guitars, just after Christmas, when that game seemed to be one of the most sought-after PS2 games around.  First, we head to Best Buy (1): no luck, other than a Guitar Hero 1 and guitar combo. Next, we hit up Walmart (1). We visited a couple more places whose names escape me, to no luck.  I picked up a copy of some bargain-bin stunt car game for PC, which still sits packaged on my desk.

    We head home after picking up some movies to watch, one being The Naked Mile.  Decent movie, a bit cliché in parts, and with some seriously horrible actors ("Coos" comes to mind).

    Day 2

    Day 2 of the journey began with breakfast at Jack in the Box, then a trip to Target (pronounced Tar-jhey).  Nothing there, of course. Off to Fry's, Toys'R'Us, Costco, EB Games, Games For Less, and then Circuit City.  Circuit City was a score: we found Guitar Hero 2 packaged with one of the guitar controllers.  Us being the fun people we are, we wanted another controller, and remembered that Walmart sold wireless guitar controllers.  Off to Walmart.

    This is where luck smiled upon us.

    Browsing through Walmart, getting stuck behind the usual Walmart crowds, we finally get to the electronics section, where I head directly to the PlayStation department, thinking we might just get another PS2 for upstairs so we don't have to deal with my brother-in-law and his 2-yr-old mindset of "mine, I wanna play".

    As I look in the case at the PS2's, I see a gold mine right next to them: a 60 gig PS3 sitting there with a golden halo.  Without even taking a breath, other than a small exhale that sounded like I spent too much time with Tim Taylor, I rush the Walmart stock boys to open the case and "gimme", 90% expecting them to say "oh, that's just the box for display purposes, it's not really a PS3".  Instead, they say "oh wow, when did that get put in there...".  In my frantic speaking, I instruct the stock boy to grab me a copy of Tony Hawk's Project 8 to go along with it.

    $700 later, we head out to the car, being asked about 10 times where we got the PS3; my fear of being ganked was huge.  We got it in the car, and also had purchased a wireless guitar controller; thankfully the wife wasn't as shocked as I was about buying a PS3.  One thing I will say, the damn thing is heavy, like... really heavy.  My PC weighs in at about the same amount.

    We head home to do some playing of Guitar Hero 2.  I had unpacked my PS3 but hadn't plugged it in yet; being the patient husband I am, I held in my anxiety to get started on the PS3 and spent some time playing a fun game with the wife.  Here is a hint for you boys out there who want your wife/girlfriend to be "ok" with you playing games: play games with her that she likes until she is tired of playing them, then you get free rein of the games you want to play while she does other things.

    I will say this first and foremost: if your TV sucks, get a PS2.  I first plugged the PS3 into the 8-yr-old 37" TV in the family room of my wife's parents' house; other than Tony Hawk not going to standard 4:3 resolution properly, it looked just like PS2.  I was flabbergasted, pissed off, and needless to say, quite disappointed.  Still, I love Tony Hawk, so I spent about an hour getting familiar with the new game, new trick methods, etc...

    The wife and I head to Target (again, Tar-jhey) to pick up some batteries for the wireless guitar, as well as a controller and a new game for the PS3 so that she can enjoy it as well.  This is where I bash a company for making shitty hardware, so beware.  We pick up a Creative PS3 controller; having loved Creative for years, I'll say they need to go back to making speakers and sound cards.  The controller, not only was it clunky due to the "cooling system", the analog controls would stick and seriously cause some stupidity in the controls.  After a 10 minute digestion of Tony Hawk MP with half the side of the screens missing, we went back to Guitar Hero 2 and completed the Career on Easy.

    Day 3

    Digging in the trash for my receipt for the controller, we head back to Target (tar-jhey), exchange it for a Sony PS3 wireless controller, and pick up Marvel Alliance.  We probably play for about 3-4 hours, then the wife goes and gets ready for our New Year's Eve party while I play Tony Hawk some more in career mode.

    Warning: after passing 25, your body cannot consume mass amounts of alcohol like when you were 21.

    Ok, so we go to a party, I drink about 10 shots of tequila and half a bottle of vodka in our favorite drink (Strip and Go Naked), and I am checking out my lunch from 3 days ago in the toilet.  Two days later I still feel like I was in a car accident.  It wasn't good.  I am pretty sure that will never happen again; I don't think I could manage it, and my wife's patience was very good.  I wouldn't want to have to put her through that again...


    Day 4

    My wife's parents are having a New Year's party downstairs; I pretty much stay in bed zoning in and out of sleep, trying to rehydrate myself, until about 4pm when I finally make it out of the house alive.  We head to IKEA to get some candles and other junk.  I love that place.  When we get home we watch "The New World" or something like that with Colin Farrell and Christian Bale.  I like movies, seriously I do, but this thing was a snorer; it was mostly about 2 people walking around in the grass with some plot thrown in there about people coming to start colonies in America.  Don't watch this unless you have insomnia and need to sleep.

    So, the wife is dead asleep, it's 8pm, it's time for more PS3.  I haul everything upstairs to our bedroom and plug the PS3 into my 23" HD widescreen (using RCA, didn't have an HDMI cable handy).  I go through the configuration to put the system back in 16:9, etc...

    WOW, talk about a huge damned difference.  The wife wakes up and we play through the Omega Base in Marvel Alliance, and the game is beautiful.  I can't imagine what it will look like in HD; it really can't get much better.  After about 3 hours, we hit the sack, and that brings me to this morning, where I wake up feeling like a trainwreck again, but still having to go to work.


    All in all, it was a very eventful weekend, and I got to spend some quality time with the wife, which we have really needed lately.  If you can find a woman who will drive you home, stop every block to let you puke out the door, help you inside, try to keep you hydrated, go to the store for you the next day, bring you food in bed, and keep a smile on her face, marry her; she really loves you.  Just make sure you do something special for her after acting like an ass like that.

    Now, tonight, I shall go home, take the wife out to a nice dinner, come home, play some PS3, pack it up for work (I got that messenger bag for it; it's quite nice and worth the $60 to be able to safely transport it around) and head to bed for another great night.


    Some of you may wonder when I will get some time to work on RenEvo projects; that will most likely start this weekend.  I want to get some things in my personal life taken care of this week while I am home at night, as well as play my new PS3.

  • Creating Your Own Community Server Skinned Control

    In my last article, I went into creating your own tabbed page that kept with the theme system of Community Server; today I will take it a step further and explain how to create a new control that takes advantage of the Community Server "skin" system.  First and foremost, you "should" have a local running copy of CommunityServer if you plan on doing any development for it; if not, now is a good time to set one up really fast.

    Let's start out by copying our Default.aspx from our last exercise and naming it SkinnedControl.aspx in the Sample directory.

    Open up the newly created aspx and delete the "samplepageContentPart" CS:ContentPart control.  This will allow us to start from scratch with this article; feel free to leave the side menu, as it will be used later for linking the two pages the easy way.

    Next, double click on the right content part and add a link to Default.aspx and SkinnedControl.aspx for ease of navigating between the sample controls.  This can be done with the following HTML code.

    <h4 class="CommonSidebarHeader">Sample Page Links</h4>
    <div class="CommonSidebarContent">
        <li><a href="Default.aspx">Sample Page</a></li>
        <li><a href="SkinnedControl.aspx">Skinned Control</a></li>
    </div>

    Now we have a nice side menu to navigate between our sample pages; in future articles, you can simply add to this link list.

    Creating our Web Class

    This is where you need to make a decision: you can code the control in any of the .Net languages, and for this article I have provided a VB and a C# project for the code.  You can download either VB Express or C# Express to complete this article.

    Open up Visual Studio, create a new Class Library project, name it "RenEvo.CommunityServer.Articles", and click OK.

    First and foremost, rename "Class1" to "SampleControl". In VB this may or may NOT change the class name in code, if not, go ahead and change it now.

    Click on References, add Reference, and then Browse to your CommunityServer website's bin folder from the Browse tab, and select the following assemblies.

    • CommunityServer.Components.dll
    • CommunityServer.Controls.dll

    Once again, Click on References, add Reference, and select System.Web from the .Net tab.

    And finally, we need to set up our build environment.  In C#, open up the properties window, select the Build tab, and browse to your CommunityServer's bin directory for the output path.  This will be used to essentially deploy your assembly for debugging.  In VB, double click on "My Project", select the Compile tab, and browse for the CommunityServer's bin directory.  You can alternatively set up the Release configuration to build to the same directory, but since this is a sample, I will not go over the recommended flags and settings for release builds.
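    If you prefer editing the project file by hand, the same setting lives in the csproj.  Here is a sketch of the relevant fragment; the path below is just a placeholder for wherever your CommunityServer site's bin folder actually is:

    ```xml
    <!-- Placeholder path: point Debug builds at your CommunityServer bin folder -->
    <PropertyGroup Condition=" '$(Configuration)' == 'Debug' ">
      <OutputPath>C:\Inetpub\wwwroot\CommunityServer\bin\</OutputPath>
    </PropertyGroup>
    ```

    Either way, every build drops the assembly where the site can load it, which is what makes the edit-build-refresh loop quick.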

    Let's open up our SampleControl class code.

    To save some typing, we are going to import some namespaces into our class.


    using System;
    using System.Text;
    using System.Web.UI;
    using System.Web.UI.WebControls;
    using System.Collections.Generic;
    using CommunityServer.Controls;
    using CommunityServer.Components;


    Imports System.Data
    Imports System.Text
    Imports System.Web.UI
    Imports System.Web.UI.WebControls
    Imports CommunityServer.Controls
    Imports CommunityServer.Components

    Next, we will inherit from CommunityServer's TemplatedWebControl.


    public class SampleControl : TemplatedWebControl


    Public Class SampleControl : Inherits TemplatedWebControl

    Create a region that will contain our private declarations; we will store a Title that will be settable in the aspx when creating the control on a page.


    #region Private Declares

    private string m_Title = "";

    #endregion


    #Region " Private Declares "

    Private m_Title As String = ""

    #End Region

    Before we go and build, we still have a bit more to do, so stay away from that build button.  At this point we have created a new project, added some references to CommunityServer, and added our "codebehind" for our skinned control.

    What you may have noticed, if you have ever developed web pages, is that we have not added an .ascx to our project.  The reason behind this is that we are using the CommunityServer framework to build and display our control based on some settings that we will apply next for "skinning" a control.  What basically happens is that when a CommunityServer page creates your control, it then calls the "AttachChildControls" function to allow you to bind any internal controls that you have for your control manually via "FindControl".  This allows developers to only put items that they need in any new skins that they create.  It is VERY important to check for nulls on any controls you "think" are on the page, as they might not actually have been included in the skin.

    Skinning Support

    Now that we have cracked our Class a bit, lets add some "skin" support to the code behind.

    Create a new region (hey, regions are fantastic ways to organize your code) with a title of "Skin Support", and create a new protected variable called SkinnedControl of type WebControls.Repeater.  Do not set this to a new instance; this will be our internal reference to our "imaginary" control in the skin.

    Next we will want to override two functions in the base class (TemplatedWebControl): OnInit and AttachChildControls.  Both of these will be used for applying our skin at run time and informing the framework exactly what files to use.  In OnInit we will simply specify our ExternalSkinFileName and then call the base class's OnInit.  In AttachChildControls we will attempt to set our SkinnedControl.  Below are the code snippets to accomplish this.


    #region Skin Support

    Repeater SkinnedControl;

    protected override void OnInit(EventArgs e)
    {
        // set our skin name, looked up in the "./Themes/<currentskinname>/" folder
        base.ExternalSkinFileName = "Skin-SampleControl.ascx";
        base.OnInit(e);
    }

    protected override void AttachChildControls()
    {
        // attach our reference to the control in the skin (if it exists)
        SkinnedControl = (Repeater)FindControl("SkinnedControl");
    }

    #endregion



    #Region " Skin Support "

    Protected SkinnedControl As WebControls.Repeater

    Protected Overrides Sub OnInit(ByVal e As System.EventArgs)
        'set our skin name, looked up in the "./Themes/<currentskinname>/" folder
        MyBase.ExternalSkinFileName = "Skin-SampleControl.ascx"
        MyBase.OnInit(e)
    End Sub

    Protected Overrides Sub AttachChildControls()
        'attach our reference to the control in the skin (if it exists)
        SkinnedControl = TryCast(FindControl("SkinnedControl"), WebControls.Repeater)
    End Sub

    #End Region


    Databinding is a part of everything that makes a website function dynamically; without it, everything would be a plethora of hard-coded table generation and response.write statements (remember asp 3.0 and php?).  Since we aren't quite covering data layers etc... yet in this article, we will simply create a dummy table in code and use it as a binding source.  To do this, we are going to override the base class's DataBind method, call our base class's DataBind function first, see if our SkinnedControl is set to an instance (to prevent errors, remember), create a sample DataTable, then bind it to our control.  And once again, we will include this section inside of a region called "Databinding".


    #region Databinding

    public override void DataBind()
    {
        base.DataBind();

        if (SkinnedControl != null)
        {
            // Create a sample table
            System.Data.DataTable dt = new System.Data.DataTable("SampleTable");

            // Add Columns
            dt.Columns.Add("Column1");
            dt.Columns.Add("Column2");
            dt.Columns.Add("Column3");

            // Add Rows
            dt.Rows.Add("1", "2", "3");
            dt.Rows.Add("4", "5", "6");
            dt.Rows.Add("7", "8", "9");

            // Bind it to our skinned control
            SkinnedControl.DataSource = dt;
            SkinnedControl.DataBind();
        }
    }

    #endregion



    #Region " Databinding "

    Public Overrides Sub DataBind()

    If Not SkinnedControl Is Nothing Then
    'Create a sample table
    Dim dt As New DataTable("SampleTable")
    'Add Columns
    'Add Rows
    dt.Rows.Add(New String() {"1", "2", "3"})
    dt.Rows.Add(New String() {"4", "5", "6"})
    dt.Rows.Add(New String() {"7", "8", "9"})

    'Bind it to our skinned control
    SkinnedControl.DataSource = dt
    End If
    End Sub

    #End Region

    Next, we need to call the DataBind function.  For our purposes, we are going to override the RenderContents method, check for postback, and if false, databind; then finally tell our base class to RenderContents.  Once again, we put the Render code inside of a Render region.


    #region Render
    protected override void RenderContents(HtmlTextWriter writer)
    {
        // bind our data
        if (Page.IsPostBack == false)
        {
            DataBind();
        }

        // this will apply our skinned asp:repeater here
        base.RenderContents(writer);
    }
    #endregion

    #Region " Render "

    Protected Overrides Sub RenderContents(ByVal writer As System.Web.UI.HtmlTextWriter)
    'bind our data
    If Page.IsPostBack = False Then
    End If

    'this will apply our skinned asp:repeater here

    End Sub

    #End Region


    Build and test

    At this point, we can take a break for all of you impatient people and see if our control works or not; then we will get into customizing it a bit more.  First things first, hit the build button.  If you followed the code snippets correctly, you should have had a clean build; if not, go back and check what might have gone wrong.  C# may throw a warning that m_Title is assigned but never used; this will be rectified near the end of this article.

    Now we need to create a new ascx for our skin.  Above, we named it "Skin-SampleControl.ascx", so open up your CommunityServer/Themes/default/Skins folder and create a new text document aptly named "Skin-SampleControl.ascx".  If you would like, you can now open this file with your favorite editor; I prefer Visual Studio myself.

    Since we are doing a simple control without anything special on it, and obviously no code behind, we don't need any special directives at the top of the file.  Let's just add a simple asp:repeater control, name it "SkinnedControl" since our code looks for a control by that name, and then give it a very simplistic HeaderTemplate that creates a table and column header row, an ItemTemplate that just iterates our 3 sample columns, and a FooterTemplate to close the table.  Below is the ascx code and how it should look for this sample.


    This is a sample control, this specific control only has a repeater on it.
    <asp:repeater id="SkinnedControl" runat="server">
        <headertemplate>
            <table border="0" class="CommonSidebarContent">
            <tr><td width="140" align="left" class="CommonBreadCrumbArea">Column1</td><td width="30" align="left" class="CommonBreadCrumbArea">Column2</td><td width="30" align="left" class="CommonBreadCrumbArea">Column3</td></tr>
        </headertemplate>
        <itemtemplate>
            <tr><td><b><%# DataBinder.Eval(Container.DataItem, "Column1")%></b></td><td><%# DataBinder.Eval(Container.DataItem, "Column2")%></td><td><%# DataBinder.Eval(Container.DataItem, "Column3")%></td></tr>
        </itemtemplate>
        <footertemplate>
            </table>
        </footertemplate>
    </asp:repeater>

    Save the control, and now let's add it to the SkinnedControl.aspx we created earlier in the article (CommunityServer/Sample/SkinnedControl.aspx).

    Add the control to the page

    First we want to register a tag prefix for the project; let's add it just above the Import Namespace line.

    <%@ Register TagPrefix="SC" Namespace="RenEvo.CommunityServer.Articles" Assembly="RenEvo.CommunityServer.Articles" %>

    Next, just below our first AdPart, lets add the control to the page.

    <SC:SampleControl runat="server" id="SampleControl1" />

    Pretty easy, huh?  Now, the moment you have been waiting for: let's go ahead and visit SkinnedControl.aspx!

    If you have followed this article to a tee, you should have a little table with 3 columns and 4 rows, the first being the column headers.  If not, go back over the article and figure out where you went wrong.  No seriously, you didn't pay attention; start over.

    Ok, now I am going to tell you what that m_Title was for, as well as some tricks to make this a bit more snazzy.

    Customizing the Control

    Let's create a property that will wrap our m_Title internal variable. I love me some regions...


    #region Properties

    public string Title
    {
        get { return m_Title; }
        set { m_Title = value; }
    }

    #endregion

    #Region " Properties "

    Public Property Title() As String
    Return m_Title
    End Get
    Set(ByVal value As String)
    m_Title = value
    End Set
    End Property

    #End Region

    Now, let's go ahead and create a generic way to output the "Title" of our control. In the RenderContents function, add the following code below the databind, but above the call to the base class's RenderContents.


    // pre writer stuff
    // optional title
    if (m_Title.Length > 0)
    {
        writer.WriteBeginTag("div");
        writer.WriteAttribute("class", "CommonMessageTitle");
        writer.Write(HtmlTextWriter.TagRightChar);
        writer.Write(m_Title);
        writer.WriteEndTag("div");
    }


    'pre writer stuff
    'optional title
    If Me.Title.Length > 0 Then
        writer.WriteBeginTag("div")
        writer.WriteAttribute("class", "CommonMessageTitle")
        writer.Write(HtmlTextWriter.TagRightChar)
        writer.Write(Me.Title)
        writer.WriteEndTag("div")
    End If

    Now if we build and refresh our page, you will notice nothing changed; that is because we did not set our Title in the control definition.  Open up SkinnedControl.aspx and add the title="Sample Title" attribute to the SampleControl. When you save and refresh the page, you will notice that you now have a nifty title above the table!  Below is the code, in case you goofed up those easy instructions.

    <SC:SampleControl runat="server" id="SampleControl1" title="Sample Title" />

    That is pretty much it, but to clean it up and show you a bit more of what is possible, I have added a bit more code to the RenderContents function that fully wraps the table with some custom styling.  The modified RenderContents is below.


    #region Render
    protected override void RenderContents(HtmlTextWriter writer)
    {
        // bind our data
        if (Page.IsPostBack == false)
        {
            DataBind();
        }

        // pre writer stuff
        // optional title
        if (m_Title.Length > 0)
        {
            writer.WriteBeginTag("div");
            writer.WriteAttribute("class", "CommonMessageTitle");
            writer.Write(HtmlTextWriter.TagRightChar);
            writer.Write(m_Title);
            writer.WriteEndTag("div");
        }

        // our wrapper for the content
        writer.WriteBeginTag("div");
        writer.WriteAttribute("class", "CommonMessageContent");
        writer.WriteAttribute("id", this.ID);
        writer.Write(HtmlTextWriter.TagRightChar);

        // this will apply our skinned asp:repeater here
        base.RenderContents(writer);

        // close our content wrapper
        writer.WriteEndTag("div");
    }
    #endregion

    #Region " Render "

    Protected Overrides Sub RenderContents(ByVal writer As System.Web.UI.HtmlTextWriter)
    'bind our data
    If Page.IsPostBack = False Then
    End If

    'pre writer stuff
    'optional title
    If Me.Title.Length > 0 Then
    writer.WriteAttribute("class", "CommonMessageTitle")
    End If

    'our wrapper for the content
    writer.WriteAttribute("class", "CommonMessageContent")
    writer.WriteAttribute("id", Me.ID)

    'this will apply our skinned asp:repeater here

    'close our content wrapper
    End Sub

    #End Region

    End of article wrap up

    Throughout this article you have gotten a decent grasp of how the skin system works with CommunityServer.  It is not nearly as overwhelming as some might imagine, and is actually kind of fun once you get the hang of it.  Something about blindly binding code to an ascx really got me fired up: in my own product, when the need arose to have a secondary view format for a specific web page, I took a similar approach by creating multiple ascx files that shared the same code behind, and it worked out wonderfully.

    Some key things that I would like to mention about this article: the original article had rave reviews and fantastic feedback, which prompted me to want to write more, as it seems that I have something to offer to the CommunityServer community.  I have provided with this article all of the files necessary to get the sample running on your site before even cracking the code, which is available in C# and Visual Basic from the download link below.

    Again, thanks to everyone for their support; it is really encouraging to spend time writing an article and have it get such great responses.

    Download Article Code

    See the online demo!

  • Dante broke the egg

    Far Cry's crygame.dll - the Comstock Lode for all the game logic in Far Cry - has come into focus for the C&C Far Cry project.   FINALLY.  The tears you see: they are real.  It's taken me a while to get my composure straightened out.  I'm sooo happy.  *gasps for air*

    Give me a minute... talk amongst yourselves.

    Ok, back to reality.  Pardon the stupidity of that opening.  Being a software engineer (just half a year shy of receiving my bachelor's of science) I am most comfortable in the C++ programming language.  I can solve any problem, write any rule, and do it very lean and fast.  What that means for you is a better game in a shorter amount of time.  Up until now, all of our code has been situated in Lua (an external scripting language comparable to mIRC scripting, only a lot more open ended) for several reasons, namely to speed up development and to have the code out in the open for others to build off of and modify.  For us, having a strong mod community is important, and Lua lets us build a strong game that at the same time is very accessible for you to build a strong mod.

    Now just because we're migrating some of our code into C++ does not mean we are compromising our principles.  It means just the opposite.  Our intentions are to hide only the nitty-gritty details that are heavy in math and computational paradigms.  We can speed up these items, making the game faster and overall better for you.  With access to the net code, rendering methods, and script objects, we can improve the performance of the game while also putting newer items into Lua that you - the modder - can use.  Items which would otherwise be totally inaccessible for the average mod team.

    For example: Far Cry has no actual tanks anywhere within the shipped version of the game.  They are not part of the game. They are in C&C Far Cry.  Because of this, turrets on vehicles update their orientations unrealistically when compared to how a tank works.  Wherever you look, that is where your turret points.  By gaining access to the DLL, we will be able to implement a capped movement speed on turrets, forcing them to move towards where you are looking at a defined speed, making it look and feel more realistic.  We - and that includes you - would not have been able to do this from the available resources in the Lua environment for this game.  Besides adding this ability, we will also extract the speed setting for a turret on a vehicle into the Lua script for that vehicle, enabling you to define this max speed from inside the Lua script.  Now something that you couldn't do before can be done very, very easily.

    It's 3 in the morning.  Cut me some slack here. 

    Over the holiday period, we will be porting a lot of foundation code into the DLL and performing some cleanup in our Lua stock code.  Some core features - like the one mentioned above - are also planned.  C&C FarCry is going to prove to be a testing ground for some of the concepts we have in mind for C&C: The Dead 6.  There are going to be some pretty exciting things going on in the next few months, so stick around.

  • Coming Attractions - In Development

    Well, since it has been a few days since my last blog post, I thought I would post up a bit about the coming attractions and what is going on developer wise with RenEvo.

    C&C FarCry

    We have cracked the egg: I finally gave approval to vloktboky to open up the crygame.dll for modification.  On this project I was trying to keep from opening the crygame.dll, but at this point I don't want to have to try any more Lua scripting workarounds.  We are going to go through some of our Lua workarounds now and begin optimizing the code into C++; for example, our "dummy AI" will be moved to C++ for speed.


    SWG Labs

    I just registered the domain "SWGLabs.com" and will be posting up a site for it soon.  It isn't high on my priority list right now, but will be expedited the first of the year.  I have brought in Anzel to hopefully help me out of my current dip in coding ideas on how to get around some new packet issues; as soon as we work through those, I plan on making that Phase I release that is overdue, at the time of writing, by about 3 weeks.

    Website + CommunityServer

    I am currently working on some new items that I will write an article on.  Just to preview, it will be on how to create your own data layer similar to the CommunityServer data layer (you don't want to recompile already compiled code), create a class for it, then create the front end controls for it.  At first it will be a simplistic data display; then I will move on with articles on some more advanced ideas.  I will keep the Sample Page up and use it for my articles from this point forward.  I had an amazing response to my last article, and plan on keeping them going until I run out of steam.

    The Dead 6

    Saving this for last: with all the new staff that C&C FarCry has brought on, we are eagerly trying to finish it up, then move into the Dead 6 project.  It isn't dead; we are actually finalizing a bit of stuff, trying to get our heads squared on the concept, and working between EA and Crytek to keep things in order legally and development wise. At this point we have an "official" "don't try to sell it or steal the license and you can do it" ok from EA.  It was never about talking to them and them saying ok, but they have at least said that.


    As some of you may have seen, vloktboky and I have opened up two forums for people to come and ask code questions in.  We will do our absolute best to answer those questions, and as we see people start to take part in helping others, we will try to reward them in any way that we can.  Knowledge is power, but without sharing that knowledge you are just as dumb as the guy sitting next to you.

    The holidays

    Don't expect a huge amount of incoming information from us at the site during the holidays, at least from me: I have end of the year software wrap-up, heading home to Indiana for the first time in 6 years, as well as holiday shopping and parties to attend.  It's a busy time of the year.  I hope everyone has a great holiday, and I look forward to a more prominent year of RenEvo coming back to life with a new face and goal.

  • Adding Pages To Community Server

    As some people have seen on this site, I have added an IRC page.  In this blog I will briefly go over how I created the new page and then implemented it in the site to show up on the tab list.

    First Step

    The first thing that I did was create a new directory and copy the Default.aspx from the root Community Server folder into the newly created folder.  For this example, we will call the folder "Sample".  Next, navigate to the new page http://www.renevo.com/Sample/

    Now that we have our new page, we need to add it to the menu, and get rid of the default content that is on the page.

    Removing The "Home Page" Content

    Open up the new Default.aspx and edit the Page_Load function at the top to remove some "Home Page" specific items.  Remove everything below the UsersOnline.SetLocation("Home"); call, then modify that string in the function call to "Sample Page".
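    After that trim, Page_Load is left doing little more than setting the location.  A minimal sketch of what it might look like (the exact signature and surrounding members come from the stock Default.aspx shipped with your CommunityServer version, so treat this as illustrative rather than exact):

    ```csharp
    // Hypothetical sketch of the trimmed Page_Load in Sample/Default.aspx
    private void Page_Load(object sender, EventArgs e)
    {
        // keep only the location call, renamed for our new page
        UsersOnline.SetLocation("Sample Page");
    }
    ```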

    Next, scroll down and remove the items inside the <div class="CommonContent"> (you can leave the AdPart if you like); then remove all of the components inside of the <div class="CommonSidebar"> section.

    Finally, it is also a good idea to remove the TagPrefixes and imports at the top of the page that you won't be using, specifically the Blogs.Components, Discussions.Components, and Galleries.Components imports and the Galleries, Blog, and CSD TagPrefixes.

    Now if you navigate to your page, you will have a blank page to work with.  At the end of this blog is a link to download the Default.aspx used in this article, so you don't have to worry about making your own.  This is just an explanation of how I implemented the system.

    Adding Your Own Content Locations

    Inside of the first <div class="CommonContent"> element, let's add a ContentPart, which is editable by double clicking on it to bring up the modal editor, similar to the home page's welcome box you get out of the box.

    <CS:ContentPart runat="server" contentname="samplepage" id="samplepageContentPart" text=""&lt;p&gt;&lt;/div&gt;Sample Editor&lt;br /&gt;&lt;/p&gt;" />

    Woh woh woh... hold on a tick... wtf did you just do?

    What I did was create a new asp.net control and specify that it runs on the server; the "contentname" is used for storage in the database, the "id" is the name of the control, and the "text" is the default text shown if you haven't previously configured any text for it (via the double click edit mentioned previously).

    Alright, let's save it and navigate back to that page.  Double click the control, and add the following text: "My First Community Server Page".

    Adding Your Own Sidebar

    This step is totally optional, but if you want to create a sidebar for the page to have the same look and feel as the rest of the website, let's add a new sidebar editor.

    Inside the <div class="CommonSidebar"> towards the bottom of the page, insert the following code.

    <CS:ContentPart runat="Server" contentname="samplesidebar" id="samplesidebarContentPart" text="&lt;br /&gt;&lt;h4 class=CommonSidebarHeader&gt;Title&lt;/h4&gt;&lt;div class=CommonSidebarContent&gt;Content&lt;/div&gt;" />

    As you can see, this is the same type of control that we used for the top section; we simply used the HTML + style of the sidebar for the default text.  After you navigate to the newly saved page, you will see a sidebar with the header of "Title", and the inside of that content box will be "Content".  You can double click on this control to edit its text; just be sure to watch your delete key.  As long as you don't delete the line between the bold text and the "Content" you will retain your style, and if you do edit it, I am sure you are clever enough to figure out how to fix it in HTML mode.
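    For reference, the HTML-encoded default in that text attribute decodes to just this fragment, which is where the "Title" header and "Content" box come from:

    ```html
    <br />
    <h4 class="CommonSidebarHeader">Title</h4>
    <div class="CommonSidebarContent">Content</div>
    ```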


    Ok, so up to this point we have created a new directory and put our new Default.aspx in it, modified the page to look like we want, and added some custom content editor areas to display some text.  What's next?

    Integrating Into The Web Site


    Here is the fun part: let's open up SiteUrls.config in Notepad. Crap, that file is huge and confusing... Yes, I agree, it is huge and it does look confusing at first, but let's just focus on the important parts.  Be wary: you SHOULD create a backup before doing anything in this file.  Mess up here and your site goes boom.  Ok, the warning about screwing everything up is over, let's move along.

    At the top of the file you will find a section called "locations"; let's create a new location in that subsection like so:

    <location name="sample" path="/sample/" exclude="true" />

    This simply creates a new location for the website, specifies its directory, and states that it should be excluded from URL rewriting (i.e., static).

    Don't save just yet; you will mess up stuff, people will complain, and you're not done yet, lazy.

    Next, scroll down to the <urls> tag, and let's add a new URL.

    <url name="samplehome" location="sample" path="default.aspx" />

    Yes, that was easy, and all of the attributes do exactly what they say: "name" is the internal name of the URL, "location" is the location name that we created in the previous step, and "path" is the file to navigate to in that location.

    So is that it? Not quite, we need to add it to the actual tabs, so hands off that CTRL+S key combo...

    Find the <navigation> tag, and let's add a new link.

    <link name="sample" resourceUrl="samplehome" resourceName="sample" roles="Everyone" />

    That wasn't so hard now, was it?  The "name" once again is the internal name of the link, "resourceUrl" is the URL that we just set up in the previous step, "resourceName" is something I will cover in the next step (it's good to just match it to the link name), and "roles" are the allowed roles for this page, separated by commas.
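    Just to tie the pieces together, here are the three entries we added to SiteUrls.config, shown side by side (the names "sample" and "samplehome" are simply the ones we picked above; use whatever matches your own page):

    <!-- In the <locations> section: the /sample/ directory, excluded from URL rewriting -->
    <location name="sample" path="/sample/" exclude="true" />

    <!-- In the <urls> section: a named url pointing at default.aspx inside that location -->
    <url name="samplehome" location="sample" path="default.aspx" />

    <!-- In the <navigation> section: the tab itself, visible to the Everyone role -->
    <link name="sample" resourceUrl="samplehome" resourceName="sample" roles="Everyone" />

    Note the chain: the link references the url by name, and the url references the location by name.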

    Ok, save the file, and let's move on.

    Dude, it didn't work; nothing changed on my page tabs...  I didn't say we were done.  Let's move on...

    Language Configuration

    You didn't think that we were going to break the multi-language support, did you?  Shame on you...

    Let's navigate to <install>/Languages/en-US/ and open Resources.xml in Notepad.

    This file has pretty much all of the English strings for Community Server in it; there are more elsewhere, but this is the bulk of them.

    Let's do a CTRL+F and look for "<!-- Main Navigation -->", which shouldn't be that far down, around line 40 or so.

    Just above the <!-- End MainNavigation --> comment, let's add the following line.

    <resource name="sample">Sample Page</resource>

    Touch Web.config

    For those of you that don't know, you will have to "touch" web.config to restart the web application, as Resources.xml seems to be read only on Application_Start.  Simply open up web.config in Notepad, press CTRL+S, and close the file.  This will force a restart of the Community Server web application.
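    If you'd rather not open Notepad every time, the "touch" can also be done from a command line. Here is a minimal sketch using a throwaway stand-in file, since the real path to web.config depends on where your Community Server install lives:

```shell
# Work in a scratch directory with a stand-in for web.config;
# the real file lives in your Community Server install root.
cd "$(mktemp -d)"
printf '<configuration></configuration>' > web.config

# "Touch" the file: bump its last-write time without changing the content.
# ASP.NET watches web.config, so this recycles the application, and
# Resources.xml gets re-read on the next Application_Start.
touch web.config

# The content is unchanged; only the timestamp moved.
cat web.config
```

    The point is that ASP.NET keys off the file's timestamp, not its contents, so any no-op save is enough.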

    If you followed my instructions, you will now have a link to "Sample Page" on your portal.

    Hey, it's there! And it works... but it doesn't seem to be selected when I navigate to it; it still looks like I am on the homepage.

    That's why I have the next step!

    Creating Your Own Master

    Let's find a new directory: head into <install>/Themes/default/Masters/.  Copy HomeMaster.ascx and name the copy SampleMaster.ascx.

    Let's open up SampleMaster.ascx in Notepad and change the following line:

    <CS:SelectedNavigation Selected = "home" runat="Server" id="Selectednavigation1" />

    to:

    <CS:SelectedNavigation Selected = "sample" runat="Server" id="Selectednavigation1" />

    Save and close the file.

    That didn't do anything

    That's because we didn't specify that our page should use this master yet.  Let's re-open our <install>/Sample/Default.aspx in Notepad and modify it to use our new master page.  Change the following line:

    <CS:MPContainer runat="server" id="Mpcontainer1" ThemeMasterFile = "HomeMaster.ascx" >

    to:

    <CS:MPContainer runat="server" id="Mpcontainer1" ThemeMasterFile = "SampleMaster.ascx" >

    And now, navigate to your page, and see some new page goodness.


    Ok: we copied a file, modified it, created some new edit regions, added the link, then bound it to a new master theme.  That wasn't too bad, was it?

    You can download all of the modifications Here

    Hope this article was helpful, if you have any questions, feel free to ask in the forums!

  • Treo and More

    So, I just recently picked up a Treo 700wx, and I must admit, at first it was a tad overwhelming.  Features, features, features.  After about two hours of playing with it, though, I am finally pleased with it and have started customizing everything.  And... I have joined the Bluetooth frontier.  I had been reluctant for some unknown reason, but after purchasing this phone I had to get the headset for it.  Also, in July 2008 California goes mandatory hands-free on the phone while driving.  If you can afford the $700 cost of buying it, I heavily recommend picking one up.

    I uploaded a bunch of pics and files to the site this weekend.  I'm having some issues with the 3ds Max W3D importer, but I will get that resolved and posted soon.  I also plan on posting vloktboky's source code this week; those are slightly larger files, so it will take me a bit more time to get them uploaded.  I sometimes forget a lot of the stuff that I have done over the years of running this site; some of it was pretty comical, some of it pretty shocking.  If you haven't seen what RenEvo has done, you should check out the downloads section for a bit of insight into what is available from the past.

    This week is a toughie for me: I am building a database versioning system for work and need to get going on it pretty fast, as well as get it completed.  Hopefully I will have time to work on SWGEmuDotNet and get the first build out for developers to start tinkering with.  If not, I plan on getting some major work done on it this weekend anyway.

    Some serious staff hiring is going on for C&C FarCry and The Dead 6 right now.  Since they are very similar projects gameplay-wise, we can hire staff for both, which is nice, and even some of the items we create can be carried over between them.  We just picked up another modeler and a mapper for sure, Oggy and Gypsy; they come from FarmCry, a mod for FarCry that looks pretty interesting and is a totally different type of mod.  We should finally be able to reach our beta release, which excites me to no end; I really want the community to start playing this mod (C&C FarCry).  It's entertaining, and it brings C&C mode to another game on a newer engine than Renegade.  Although I think that, like most other MP mods for FarCry, the Ubisoft server stuff will prove to be a pain, we will do our best to help out people who are having issues connecting to servers.

    And finally, some more work is going on with tweaking the final bits of The Dead 6, we are fixing and verifying some of the story bits, and are working with some people to get us some tools to start working on the mod for real. You may see a news announcement about this quite soon!

    That's pretty much it for today.  I gotta get back to work and wait on this damned update script to finish executing.  I knew the first update would be painful, but it is already 20 minutes in, with no sign of hope just yet.

  • Windows Live Writer Support

    So, I just downloaded some kewl additions for the site, one in particular being the Windows Live Writer plugin.

    I must say that it's pretty kewl.  I have liked Windows Live Writer since it was announced, and now it just really blows my mind to be able to use it on my own website and actually keep updates flowing!

    Windows Live Writer

    Community Server Plugin For Windows Live Writer

    Good lord... Am I turning into a blog freak?

  • Community Server

    So, after some careful considerations, I am getting happier and happier with the usage of this system.

    It was a bit clunky compared to what I am used to using; then again, it also supports a full website instead of just forums.

    I will be grabbing the SDK this weekend to see what comes of it, I am pretty sure with my skills as a .Net programmer I can come up with some sort of graceful methodology for adding some missing functionality.

    Also, the default theme thing is driving me nutz.  I'm going to spend some time trying to make it work, although it seems that it's a bit of an undertaking.  Maybe I will just build the theme from scratch, as the skinning process for the site is actually pretty well done.

    Off to work I go.  I am sure you will see quite a few more updates here, especially now that I have found a way to blog on the site with my Windows Live Writer.  Just gotta install the plugins. :P

  • Site Go Bye Bye

    For now, anyway; it seems that our server got locked up in a building.  Seriously sucks; the site was starting to average 10 registrations a day, forum posting was up, etc...

    Then BAM, the site goes down, and we may not get our data from the server for a while yet.  No fret; it isn't lost, just missing in action.

    Why did I choose Community Server

    My work is currently checking out this software to host our developer site, and I wanted to have a live demonstration and get familiar with the system myself.  So far it seems all right; there are some quirks I am trying to get around, and the default theme is seriously eh.

    Anyway, I must get home to the wife; we're going to watch Memoirs of a Geisha on our new Zoom Box.

Powered by Community Server, by Telligent Systems