Cell Phones and the Virtual Workplace

by Terry 10/30/2008 11:04:00 AM

Around the office, the hot topic has been the change to a new cell phone provider. It seems most of the phone options basically suck.

One of my coworkers chose the LG CU720 model. He reports that this model cannot use a wired headset and charge at the same time. If you use a wireless headset instead, its battery will eventually die during long phone meetings. The other phone choices are equally bad for similar reasons. I hope our options improve.

As a ‘virtual worker’ (a term I dislike because it implies I almost work, or what I do almost looks like work) I spend a lot of time on the phone. I am not fond of wireless headsets for a variety of reasons. Having no option except for a wireless headset is cause for concern. Naturally everyone in my workgroup is raising a ruckus. I eventually posted the following reply to the topic.

My preferred phone:
The battery can also be used as a spare for most home-office UPS systems

But for those who really need the latest:

Make Your Old Brick Cell Phone into a Bluetooth Headset With 8 Times the Battery Life

Tags:

computing | Personal

Forgetting what you know

by Terry 10/10/2008 10:53:00 AM

Yesterday I was working on a support issue at work. I was trying to debug some of the server management tools developed by my project team. Our tools, for a variety of reasons, are designed to work when logged on to the console session on a Windows Server 2003 or Windows Server 2008 machine. Naturally, I was rather frustrated when I used the Windows Remote Desktop client to connect to the console of the server in question but failed to actually gain a true console session. The console option was being ignored.

After a bit of running around, and with a little assistance from my teammates, the problem server was fixed and I could start investigating why Remote Desktop failed or disabled the console connection when I explicitly specified it. My co-worker, Keith, reminded me that Terminal Services options changed with Windows Server 2008 (and maybe Vista, but I work mostly with server operating systems these days), and that Remote Desktop client support changed to match. I had known this. I needed to know this while updating my tools to work with Windows Server 2008. I had forgotten what I knew and missed the connection. After a bit of research I found an article on MSDN that helps solve the issue: Changes to remote administration in Windows Server 2008 (KB947723).

Honestly, it does a poor job of explaining why the change was made, but at least it gives you a workaround: use the /admin switch instead of the /console switch.

I installed the Windows Server Service Pack 2 Administrative Tools, as I had the Service Pack 1 version installed. Unfortunately, this did not help. Maybe there is a version that supports the actual options needed; I do not have the time to look for it at the moment.

In the meantime, my solution is a little less glamorous. I have a custom toolbar on my Start Menu with a folder of links to most of the servers I connect to on any given day. Each shortcut uses a command line like the following example.

%windir%\system32\MSTSC.exe /V:MY-SERVER-01 /admin

Where MY-SERVER-01 is the name of the server I want to connect to.
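Generating those shortcut command lines can be scripted. Here is a minimal sketch in Python; the server names are hypothetical examples, not my actual servers:

```python
# Sketch: build the mstsc command line for each server in the
# shortcut folder. Server names here are made-up placeholders.
servers = ["MY-SERVER-01", "MY-SERVER-02", "MY-SERVER-03"]

def rdp_command(server: str) -> str:
    """Build an mstsc command line that requests an admin session."""
    return rf"%windir%\system32\MSTSC.exe /V:{server} /admin"

for name in servers:
    print(rdp_command(name))
```

Each printed line is what goes into a shortcut's target field.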


Tags:

computing | Personal

Frustration with RoboCopy

by Terry 10/6/2008 3:08:00 PM

At work I have several tasks which replicate about 50 GB of data across the corporate network. The why is not so important. The requirements of every replication destination are unique. Last week I built a new replication to publish data to a UXFS file system destination. My source is a Windows NTFS file share, and I use a scripted RoboCopy task to replicate the data regularly.

When I built this new mirrored replication task on Friday I expected the weekend tasks to simply do incremental updates of the destination. I was surprised that every replication was a full file copy. This was unusual, and generally bad.

My RoboCopy commands looked something like this: RoboCopy \\Source\Share \\Destination\Share /MIR /Z /COPY:DT /NP /NDL

This has basically worked without issue for ages in a dozen different usages. Oddly, the /MIR argument was not working as expected here and a full file copy was done on every execution, with the source files always showing as newer.

The short story is, after some lucky Google searches, I turned up this link.
http://forums.buffalotech.com/buffalo/board/message?board.id=0101&message.id=48

Mr. Taylor explains that file time resolution is different, more precise, in XFS than in NTFS. This makes sense; my destination is UXFS, a derivative of UFS, or so I am told. The reasoning was sound, so I tried his suggestion. Lo and behold, I had success. My new command line adds the /FFT switch, which tells RoboCopy to assume FAT file times with their two-second resolution, so the coarser destination timestamps still compare as equal: RoboCopy \\Source\Share \\Destination\Share /MIR /Z /FFT /COPY:DT /NP /NDL

I used the /COPY:DT argument for performance reasons. The destination inherits security settings from the share, so skipping the security copy (the default is /COPY:DAT) saves one check on each file.
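The scripted task boils down to assembling that command line. A sketch in Python, using the placeholder share paths from above (actually running it requires Windows and RoboCopy, so the run call is left commented out):

```python
def build_robocopy_cmd(source: str, dest: str) -> list[str]:
    """Assemble the mirror command. /FFT makes RoboCopy compare
    timestamps at FAT (two-second) granularity so files on the
    coarser-resolution destination are not endlessly re-copied."""
    return [
        "robocopy", source, dest,
        "/MIR",        # mirror the tree, deleting extras at the destination
        "/Z",          # restartable copies
        "/FFT",        # assume FAT file times (two-second resolution)
        "/COPY:DT",    # copy data and timestamps, skip the security copy
        "/NP", "/NDL"  # no progress meter, no directory listing in the log
    ]

cmd = build_robocopy_cmd(r"\\Source\Share", r"\\Destination\Share")
print(" ".join(cmd))
# subprocess.run(cmd)  # would execute the replication on Windows
```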

This chewed up most of my day trying to figure out the cause and resolution. At least I have a resolution. I just wish it left me time for software development, which is what I actually do for a living. Sometimes.


Tags:

computing | Personal

Tripping on my own feet

by Terry 8/14/2008 2:09:00 PM

Last night at work I made an error I do not think I have made in ten years of software development. I inadvertently overwrote production software with pre-production test code. I am a big advocate of using source control in software development for a team of any size, and I intentionally configured the system to force me to use distinct accounts, each with limited access and permissions. This normally serves me well, but not last night, particularly for one portion of my product. This one portion of code I inherited from another team, and I have had to rebuild it to fit into a new environment and new source control from a completely distinct origin.

I wear several hats at work: software developer, technical architect, system administrator, and change manager. I switch between these roles, sometimes frequently, during the course of a day, such as when I release a new version of my suite of products into our acceptance test or production environment. Hence the need for good source control, polished processes and compartmentalized access.

For this one small blob of code, what I have in production is not quite what I have in my source control and development and test environments, nor is any of what I have under source control history a match for the current production code. This is a bad place to be in my position.

As I was configuring a large change in our test environment last night, I made a final update. I failed to notice that the publication path was the production path (at the time, the one path without restricted access), which looks very similar to our test environment path. I released the code and scratched my head for about three minutes until it sank in what I had done.

The good news, what there is of it, is that the code I released into test last night will complete the synchronization of the inherited code and place it entirely into the better-managed release cycle. Unfortunately, this new test code depends on changes not yet in production, so it out and out broke.

To make matters worse, as part of another transition, my backups for this one portion of the production environment are not current, and may be gone. I recently changed the hosting platform to new servers, twice in the case of the problem code, each time with new backups. The old backup files are likely gone, or exceedingly difficult to recover. The new backups are not configured yet.

There is a happy ending. I do have source control, and while it did not match exactly what was in production, it was close. I restored the closest match I could find, made changes to match production functionality, rebuilt and published to my production environment. Life is good again, and life will be better yet in a couple weeks when I release a full version of all related tools with fully synchronized versions. The release process also includes full backups of all affected parts, so I will have a real recovery option should an error occur.

If there is a moral to the story, it is this: source control is good, backups are better, and access control is vital.


Tags:

computing | Personal

Vista Video Editing

by Terry 8/12/2008 1:28:00 PM

I played with video editing last night, building a DVD for a fellow martial artist. Michelle is a Black Belt candidate at DMW Martial Arts. Among the requirements for earning a black belt is providing video documentation of forms, sparring skills, self defense and board breaks. Two reasons for this are to demonstrate proficiency in basic skills and to create a visual record to review for improvement.

I am not skilled at editing video; I just use what I have. It is not great, but it works. Sort of.

I recorded the video on my hand-held JVC Everio GZ-MG37U recorder. It has a built-in 30 GB hard disk which stores a wee bit more than 7 hours of video. I can plug the camera into my home computer’s USB port, transfer the files, and away I go.

The JVC camera records video into an MPEG-2 derivative file format with a .MOD extension. I never really understood why it uses that format. Renaming a file will let it play in many players that do not recognize the .MOD extension. However, the renamed files still do not play with the correct 16:9 aspect ratio; instead they play at 4:3, giving the video a squeezed appearance. That was an irritant until my friend Steve told me about SDCopy. SDCopy is a simple utility with basically one useful function: it copies the .MOD files to .MPG files and simultaneously sets the flag in the destination file that marks it as wide-screen.
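As I understand it, that wide-screen flag is the aspect-ratio nibble in the MPEG-2 sequence header. The sketch below is my own reconstruction of the kind of fix-up a tool like SDCopy performs, not SDCopy's actual code:

```python
SEQ_HEADER = b"\x00\x00\x01\xb3"  # MPEG-2 sequence header start code
ASPECT_16_9 = 0x3                 # aspect_ratio_information value for 16:9

def set_widescreen(data: bytes) -> bytes:
    """Return a copy of an MPEG-2 stream with the aspect-ratio nibble
    of every sequence header set to 16:9, leaving everything else
    (including the frame-rate nibble it shares a byte with) intact."""
    buf = bytearray(data)
    pos = buf.find(SEQ_HEADER)
    while pos != -1 and pos + 7 < len(buf):
        # Byte 7 of the header: high nibble = aspect ratio,
        # low nibble = frame rate code.
        buf[pos + 7] = (ASPECT_16_9 << 4) | (buf[pos + 7] & 0x0F)
        pos = buf.find(SEQ_HEADER, pos + 4)
    return bytes(buf)
```

Nothing in the video data itself changes; players simply read the new flag and stretch the picture to 16:9.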

With the files all copied to .MPG format with the correct aspect ratio, I opened Microsoft Movie Maker on my Vista PC. I have mixed feelings about Vista; frankly, I am disappointed with Microsoft about it. But that is another topic. Movie Maker, however, is a disappointment all its own. It works, but it is not great.

For what I need, though, it works well enough. I have about 21 video clips that basically need titles, and then a DVD made from them.

This is where it gets really odd to me. To make a DVD with Vista, you save your Movie Maker project and open the Vista DVD Maker. DVD Maker is about the most feature-poor DVD utility I have ever used. Oh, it works, but it does not offer much of anything. Why not include DVD Maker operations in Movie Maker? I mean, ‘Movie Maker’ implies it makes movies…

Because DVD Maker is so feature-poor, my 21 video clips had to be saved as 21 separate Movie Maker project files for the DVD Maker process to work anywhere close to what I hoped. It took me about 45 minutes to save all the Movie Maker projects with the correct titles. Thankfully, I was not much concerned with editing. It was a fast process and I got into a routine: create a project, insert the clip, title the video and save the project – lather, rinse, repeat.

Then I opened DVD Maker, inserted the 21 Movie Maker projects and sorted them into my preferred play order. I spent a few minutes selecting menu options, saved the DVD project and told it to make the DVD. About forty minutes later I had a fresh new DVD and was good to go. I just did not get what I expected: scenes.

Scenes are where the DVD video is divided into segments; watch most any DVD movie and you will find a menu option for scene or chapter selection. DVD Maker takes each Movie Maker project as a separate scene, which is exactly the behavior I wanted: a simple menu to play the entire sequence as a single movie, and a scene selector to get to specific clips.

DVD Maker has no scene editing capabilities; you get what it gives you, and it gives you about 18 scenes, maximum. Remember, I have 21 clips. DVD Maker essentially combines some clips together, retaining the original order, to produce 18 scenes. When I first looked at the DVD menu on my home DVD player I had a moment of fear, thinking the video had lost a few important clips. It turns out they are still there, but you have to select the scene before the one you want, play it, and skip ahead in the video.

Overall, for a quick and dirty bit of video production, it got the job done. The JVC Everio camcorder produces grainy video under fluorescent lighting and my DVD has somewhat confused scene options, but the task is done. If I need to, I can take it to the next level and fix it using other editors, although I do not know what I would use; perhaps a Mac. The JVC software I have is buggy, but I may try it before I run out and buy a new workstation. Either way, Michelle has her video requirements complete.


Tags:

computing | Martial Arts | Personal

BlogEngine Bugs

by Terry 7/30/2008 11:15:00 AM

I updated to BlogEngine 1.4.0.0 last week. That was more work than I hoped, but overall it has been a good move, not because BlogEngine is any better, but because it forced me to update my web hosting accounts. Now that the host updates are complete, I can look at using BlogEngine more. So far the 1.4.0.0 version has annoying bugs in the posting control. I lack the time to research it all, and I am only slightly tempted to use the intermediate build versions.

The most annoying defect is that none of the script-driven functions work correctly on the posting page (Add_entry.aspx). The HTML view and the link add/edit functions do not work, which makes it painfully difficult to edit or format my journal entries. This morning I resorted to saving the post as plain text, then editing the XML data file and inserting the content the way I like. Ugly.

I have research and fixes to implement and no time to do it. I am not sure if it is a permissions issue on my host or bad script in the pages (I suspect the latter). Double ugly.


Tags:

BlogEngine.NET | computing | Personal

If you have nothing good to say…

by Terry 7/23/2008 10:29:00 AM

As kids we were all told, “If you have nothing good to say, say nothing at all.” I have so much to complain about since Saturday that I am almost at a loss for good things to say. But I will breathe deep and try.

If you read my posts or subscribe to my feeds, my apologies for the garbage posts. I periodically check my hosting provider to see what is offered, and yesterday I upgraded my web space to better features and services for a substantially reduced cost. This is great for me and everyone. Apparently, though, my hosting provider made an error and updated the DNS settings to point at the new site before I had actually built and configured the new location. If you see this post, I have corrected the issues and you are looking at the new site. It was not the smooth transition I have executed many times before.

This was all really kicked off when I upgraded my site to use BlogEngine 1.4.0.0. My hosting provider has some odd configuration settings on some of their platforms. BlogEngine has an ‘Admin’ folder for blog management (I hate the term blog, and use “Journal” where I can). To implement BlogEngine on my site I had to rename the Admin folder to a ‘safe’ folder name. I chose ‘journal_admin’ and updated all the source code to use that folder structure. Not a difficult task, but annoying when I want to apply an update.

By changing my host platform I no longer need to make this change to the ‘admin’ folder in the BlogEngine source. The new platform allows me to have folder names of my choice, which is as it should be. The garbage on my ‘new’ site was test code to make sure I could fully implement BlogEngine source without any change. No one was supposed to see that, however. Such is life.


Tags:

BlogEngine.NET | computing | Personal

BlogEngine Update

by Terry 7/17/2008 3:41:00 PM

I updated my site this morning with the latest version of BlogEngine (1.4.0). This took me about 90 minutes. The challenge is that my web host provider does not allow an ‘Admin’ folder at the root share on the server. I do not mind this so much, but I have to search for every reference to the Admin folder and replace it with a different location that works with my site.
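That search-and-replace step can be scripted. Below is a rough sketch; the file extensions and simple string matching are my own simplifications for illustration, not BlogEngine specifics:

```python
import pathlib

def retarget_admin(root: str, new_name: str = "journal_admin") -> int:
    """Replace references to the 'admin/' folder in source files under
    root with a renamed folder. Returns the number of files changed.
    A rough sketch: real code should handle more case variants,
    attribute paths and binary files far more carefully."""
    changed = 0
    for path in pathlib.Path(root).rglob("*"):
        if path.suffix.lower() not in {".cs", ".aspx", ".ascx", ".config"}:
            continue
        text = path.read_text(encoding="utf-8", errors="ignore")
        updated = (text.replace("admin/", f"{new_name}/")
                       .replace("Admin/", f"{new_name}/"))
        if updated != text:
            path.write_text(updated, encoding="utf-8")
            changed += 1
    return changed
```

Running it once against a copy of the site source makes the rename repeatable when the next BlogEngine update lands.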

Once I did that, the conversion was generally painless. My current web theme still works, but I need to update my site master to use my custom controls. At the moment I do not have the time to do this. Anyway, BlogEngine 1.4.0 appears to work.

The one problem I did run into was an error with the ProfileCommon class. I kept getting the following compile error when publishing:

The type or namespace name 'ProfileCommon' could not be found (are you missing a using directive or an assembly reference?)

I found another post with the same error at http://developerblog.net/post/2008/07/ProfileCommon-could-not-be-found.aspx. I did the same thing: deleted my site and republished. All is now well. Go figure.


Tags:

BlogEngine.NET | computing

A CodeBehind error with User Controls

by Terry 6/11/2008 3:29:00 PM

I am updating a website for my project at work. It was a .Net 1.1 site converted to .Net 2.0, and we are adding some new user controls to it. I implemented one of those controls on a new page and got a runtime error like this:

"The base class includes the field 'WebUserControl1', but its type (common_WebUserControl) is not compatible with the type of control (ASP.common_webusercontrol_ascx)"

Hunting around with Google turned up several possibilities, but none helped until I found this link: http://forums.asp.net/t/960707.aspx. A post by ‘Maduka’ suggested changing the CodeFile attribute of the page to CodeBehind. This worked in my case. It works the other way too: change the control’s CodeBehind attribute to CodeFile.
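For illustration, the two directive forms look like this on a user control (the file and class names here are hypothetical examples, not from my project):

```aspx
<%@ Control Language="C#" AutoEventWireup="true"
    CodeBehind="WebUserControl1.ascx.cs" Inherits="common_WebUserControl" %>

<%@ Control Language="C#" AutoEventWireup="true"
    CodeFile="WebUserControl1.ascx.cs" Inherits="common_WebUserControl" %>
```

CodeBehind expects the class to already be compiled into the project assembly, while CodeFile tells ASP.NET 2.0 to compile the referenced file dynamically; mixing the two models between a page and its controls is what appears to trigger the type mismatch.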

Without digging deep, I would presume that CodeFile is the .Net 2.0+ code-behind mechanism, as it is the default value for the new user controls. So, in our implementation, we are changing pages that use the new controls to use the CodeFile attribute instead of CodeBehind. I am not sure which is best practice, but this is what works for me. Your mileage may vary.

And, it turns out I may have this backward. Reading the posts a bit more indicates that CodeBehind is the new model. I will need to look at this more closely. I am hoping to update the site to .Net 3.5 by the end of the year, and I would like to clean this stuff up.


Tags:

Coding Practices | computing

In Atlanta, Georgia, June 2008, Training Day 3

by Terry 6/4/2008 11:37:00 AM

Today is my last day of Opsware/HP training. I am learning how to use Server Automation System (SAS) 6.5, and at this point I know enough to be dangerous. When I return home I will begin using the test environment being configured at the office right now.

The challenge before me and my coworkers is taking what we have learned, and what we know about our existing environment, and making a first-draft WAG at how to define our new system and processes. This is not an easy task and will likely cause more than one of us to have a meltdown.

Another challenge is remembering everything we know long enough for it to be useful. The successful implementation of change requires an understanding of what is new. Understanding the new takes time, practice and patience, plus managing the expectation of instant answers. We will not have answers initially. When the answers do come, they will likely vomit forth in a sudden rush, and those waiting for them should have their shields up as the cycle of information processing begins anew on the receiving end of the flood.

I look forward to returning home and starting into all this.


Tags:

computing

Powered by BlogEngine.NET 1.4.5.0
Theme by Mads Kristensen

About Terry Losansky

Terry Dee Losansky

I am a software architect, actively practice and teach martial arts and live in Snoqualmie, Washington. I have an amazing daughter who is the jewel of my life.




Disclaimer

The opinions expressed herein are my own personal opinions and do not represent my employer's views in any way.

© Copyright 2017
