03 Oct 2013

# Munin, mysql and semaphore: how to deal with the "identifier removed" error

6 min read (~1004 words)

The following depicts how I “solved” a problem I recently had with munin, its mysql plugins and the shared memory cache library those plugins use (written in Perl, using IPC::ShareLite).

First off, let’s begin with a description of the problem. I posted the following on serverfault.com in the hope that someone more experienced than I am could help.

I’ve recently setup a munin-node on a CentOS server. All was working fine until I tried to add the apache plugin (which works fine).

For some odd reason, the mysql plugins for munin that used to work ceased to work… I’m now getting a weird error whenever I run the plugin with munin-run. For instance

munin-run mysql_files_tables

returns

IPC::ShareLite store() error: Identifier removed at /usr/lib/perl5/vendor_perl/5.8.8/Cache/SharedMemoryBackend.pm line 156

but sometimes it will also return

table_open_cache.value 64
Open_files.value 58
Open_tables.value 64
Opened_tables.value 19341

but after a while it will revert to the previous error.

I do not have any knowledge of IPC or the ShareLite library, so I don’t really know where to start looking. Since it is a module related to shared memory, I tried tracking down shared memory segments with ipcs, without much success.

I haven’t rebooted the machine yet as it is used for many projects (I’d obviously like to be able to diagnose the problem without requiring a restart if possible).

Has anyone faced this problem? (a quick search on Google didn’t turn up any relevant help)

Thanks for the help!

Obviously, one can quickly see that this is quite a specific question that not many people may have actually encountered. Thus, I didn’t expect to receive much help out of it (and I didn’t).

I had left this issue aside for a couple of days, hoping to come back to it at some point. Munin and the mysql plugins were installed on two servers and working fine on both of them (with a third one as master node). After a minor change, one of the two client nodes stopped working correctly while the other was still fine. A couple of days later, though, the second server also decided to exhibit a similar issue…

Tonight I remembered strace, which is pretty awesome in circumstances like this one. I went ahead and launched strace munin-run mysql_files_tables, which output a lot of stuff and then stopped at the following point:

...
ioctl(4, SNDCTL_TMR_TIMEBASE or TCGETS, 0x7fff13da8e30) = -1 ENOTTY (Inappropriate ioctl for device)
lseek(4, 0, SEEK_CUR)                   = 0
read(4, "# Carp::Heavy uses some variable"..., 4096) = 4096
brk(0x163e7000)                         = 0x163e7000
read(4, "\n    redo if $Internal{$caller};"..., 4096) = 1737
close(4)                                = 0
write(2, "IPC::ShareLite store() error: Id"..., 123IPC::ShareLite store() error: Identifier removed at /usr/lib/perl5/vendor_perl/5.8.8/Cache/SharedMemoryBackend.pm line 156
) = 123
semop(14581770, 0x2ab08bb67cf0, 3

And when the problem is actually fixed, the application ends normally instead, outputting a bunch of stuff such as the following:

...
stat("/usr/lib64/perl5/auto/Storable/_freeze.al", {st_mode=S_IFREG|0644, st_size=706, ...}) = 0
stat("/usr/lib64/perl5/auto/Storable/_freeze.al", {st_mode=S_IFREG|0644, st_size=706, ...}) = 0
open("/usr/lib64/perl5/auto/Storable/_freeze.al", O_RDONLY) = 4
ioctl(4, SNDCTL_TMR_TIMEBASE or TCGETS, 0x7fffe7223570) = -1 ENOTTY (Inappropriate ioctl for device)
lseek(4, 0, SEEK_CUR)                   = 0
read(4, "# NOTE: Derived from ../../lib/S"..., 4096) = 706
close(4)                                = 0
semop(917514, {{1, 0, 0}, {2, 0, 0}, {2, 1, SEM_UNDO}}, 3) = 0
semop(917514, {{2, -1, SEM_UNDO|IPC_NOWAIT}}, 1) = 0
semop(917514, {{1, 0, 0}, {2, 0, 0}, {2, 1, SEM_UNDO}}, 3) = 0
shmdt(0x7fc30021f000)                   = 0
semop(917514, {{2, -1, SEM_UNDO|IPC_NOWAIT}}, 1) = 0
...

#### How to “fix” the problem

What you can see in the first output above is pretty interesting. The semop call gives you the semid the process is trying to obtain (the semaphore used to synchronize the different processes using the same shared memory). The signature of the semop function is as follows:

int semop(int semid, struct sembuf *sops, unsigned nsops);

where

• semid: the semaphore id
• sops: a pointer to an array of sembuf structs:

struct sembuf {
    u_short sem_num; /* semaphore # */
    short   sem_op;  /* semaphore operation */
    short   sem_flg; /* operation flags */
};

• nsops: the length of sops

Upon first inspection, you can see that the sembuf in the first case seems to be invalid if you compare it with the working version, where it is actually resolved (strace displays something such as {{2, -1, SEM_UNDO|IPC_NOWAIT}} instead of 0x2ab08bb67cf0). But that was not helping me much.
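Since the semid is the useful piece of information, it can be pulled out of a saved strace capture with a bit of shell. This is only a sketch: the log file name is my own choice, and the printf line stands in for an actual strace run (in practice you would redirect strace's stderr to the file, as shown in the comment).

```shell
# In practice: strace munin-run mysql_files_tables 2> /tmp/munin-strace.log
# For illustration, reuse the semop line captured above:
printf 'semop(14581770, 0x2ab08bb67cf0, 3\n' > /tmp/munin-strace.log

# Extract the semid from the first semop() call in the log
grep -o 'semop([0-9]*' /tmp/munin-strace.log | head -n 1 | cut -d '(' -f 2
```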

With that semid you can do two things: first, check whether it is still alive by calling ipcs; second, remove it with ipcrm -s semid.

In my case, the “fix” itself was to remove the semaphore that the plugin wasn’t able to obtain (the reason for this still eludes me though). After removing the semaphore, munin runs correctly again and the identifier removed error is gone.
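In command form, the check-then-remove step looks roughly like this. The semid is the one from the strace output above; the awk column index assumes the usual util-linux ipcs -s layout (semid in the second column), so double-check the listing on your system before removing anything.

```shell
SEMID=14581770  # semid taken from the semop() call in the strace output

# ipcs -s lists the active semaphore arrays; column 2 is the semid
if ipcs -s 2>/dev/null | awk '{ print $2 }' | grep -qx "$SEMID"; then
    ipcrm -s "$SEMID"   # remove the stuck semaphore
else
    echo "semaphore $SEMID not found (already gone?)"
fi
```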

I will have to do more research into how/why this issue occurs, as I’ve seen it happen only on CentOS machines so far (the master server is a Debian machine).

18 Sep 2013

# Accepting self-signed (or unknown) certificates with TortoiseGit

2 min read (~322 words)

For some odd reason, TortoiseGit doesn't seem designed to let the user interact with git when git requires user input. For instance, accepting self-signed certificates is not possible, which gives you the known issue 318.

As of TortoiseGit 1.8.5.0, it is still not possible to accept certificates through the GUI. But it is possible to get your git repository to work with TortoiseGit (and have it work with the required certificate).

You will need to have msysgit installed and available in your PATH for the following to work.

The first step is to run some git command, for instance git clone https://myserver.com/depot, via the command line so that git may auto-accept (or ask you to accept) the certificate. This step is crucial to get the certificate details saved on your machine.

What you will want to do next is go to C:\Program Files (x86)\Git\.subversion and copy everything into %USERPROFILE%\.subversion. Basically, this copies the certificates that were accepted by msysgit so they can be used by TortoiseGit.

Another, and possibly better, solution is to create a symbolic link so that those two folders are in fact a single one. For instance, you could do something such as

move %USERPROFILE%\.subversion %USERPROFILE%\.subversion_backup
mklink /D %USERPROFILE%\.subversion "C:\Program Files (x86)\Git\.subversion"

which will make %USERPROFILE%\.subversion point to your C:\Program Files (x86)\Git\.subversion folder. This has the benefit that any future certificate will work both for msysgit and TortoiseGit.

Thanks to Mexx’ C# Corner for pointing out the solution.

05 Jan 2013

# 2012 ManicTime review

8 min read (~1572 words)

As 2012 ended, I wanted to look back at the year and review my computer usage/consumption in order to reduce time-wasting activities. That time should be channeled into more meaningful activities like learning a new language, improving my current skills, practicing piano and more.

The following data was collected from August 11, 2012 to December 31, 2012. There are about 14 days with no data (the application was closed).

The data covers my computer usage, with over 966.21h of active use. During the collection period, the computer was also powered off for 1686.30h and left unused (away) for approximately 58.04h.

If we count 2 months of 31 days + 2 months of 30 days + 20 days (August) - 14 days without data, we get 128 days of data. This averages to 7.55h/day of active computer usage. The way it is currently "structured", however, is about 4-5h/day of usage during the week and about 12h/day on the weekend.
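The day count and daily average above can be re-derived quickly (just checking the arithmetic):

```shell
# 2 months of 31 days, 2 of 30, 20 days of August, minus 14 days without data
awk 'BEGIN {
    days = 31*2 + 30*2 + 20 - 14
    printf "%d days, %.2f h/day\n", days, 966.21 / days
}'
```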

This sounds a bit high, but there's a reason for it: I'm not ACTIVELY using the computer for all that time. In ManicTime, the computer is considered active if it is used at least once within a 60-minute time frame. This means I could potentially be using the computer for 1 minute (or less) every hour and it would still count as 1h of active usage. But for the sake of this review, I'll consider myself a computer addict (which I am) and count every minute as an active minute.

The following top 10 items account for 876.18h out of the 966.21h of active computer usage.

| Application | Hours |
| --- | --- |
| Google Chrome | 457.76 |
| Remote Desktop Connection | 179.11 |
| League of Legends (TM) Client | 72.69 |
| VLC media player | 62.9 |
| HexChat | 35.81 |
| Free Alarm Clock | 15.02 |
| mRemote | 14.9 |
| Sublime Text 2 | 14.56 |
| Windows Explorer | 12.13 |
| Torchlight II | 11.3 |

At first sight we can see that I spend a lot of time browsing the web. I do various things there, and since it is the biggest chunk of my time, it is worth looking at what exactly I do on the web. The following table is the top 10 websites I've spent time on.

#### Web activity

This covers 259.15h of the 457.76h (56.6%) spent in Chrome. This means I have a long tail (a list of many different websites I visit briefly) of 198.61h. The major time consumer here is www.reddit.com, which accounts for 36.2% of my browsing time. Even though reddit is a news/media website (useful for staying up to date with world events; not sure I do that...), it also contains a lot of content I would categorize as time wasters: funny pictures, pictures of cats and videos, as well as discussions about topics of interest (computer science, software engineering, robotics, electronics, etc.). That's about 1.3h/day over the 128 days for which I collected data, which I find to be quite a lot.
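As a quick sanity check on that 1.3h/day figure (reddit's 36.2% share of the 457.76h spent in Chrome, spread over the same 128 days):

```shell
# reddit share of Chrome time, averaged over the 128 days of data
awk 'BEGIN { printf "%.2f h/day\n", 457.76 * 0.362 / 128 }'
```

which rounds to the 1.3h/day mentioned above.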

As for the other sites, here are a couple of notes:

• www.jolteon.net: This is a bug tracker I use to track new features/bugs where I work. It serves as a personal system for me to track these issues. I use it frequently to update task statuses as well as to enter anything I may have forgotten to add during the day. I also like to review it frequently to remind myself of what is left to work on (and let my mind figure that out while I sleep).
• www.youtube.com: I often end up on youtube because of reddit. I enjoy watching documentaries, which last 30-45 minutes on average.
• docs.google.com: I've spent some time writing documents in Google Docs simply because it allowed me to share them with others so they could review my work.
• -confidential-: This is a website I use to manage "things" for work. I generally go there every day and it takes me 5-15 minutes on average.
• www.twitch.tv: Watching Starcraft 2 and LoL streamers for a while.
• www.google.ca: Looks like I spend a lot of time searching...
• en.wikipedia.org: Whenever I don't know something about a subject of interest, wiki is a good source (generally...).
• www.facebook.com: Checking that everyone I know is still alive.

#### Back to the apps

If we go back to the applications I use, the next on the list is Remote Desktop Connection, which I use to connect to my PC at work so that I can do some work from home. As you can see, I spent about 1.4h/day working remotely. Considering that I am not a "work at home" employee, I find this outrageously high. I would like to see it as close as possible to 0h/day.

Next is the League of Legends (TM) Client. I've recently become interested in the game and started playing it on a more regular basis. I would like to keep this at around 1h/day or lower.

I've used VLC media player to watch series as well as movies on my PC. Series episodes run 20-45 minutes while movies vary from 1h40 to 3h40. I'd say that about 225h/year looks like an acceptable amount of time to spend on this.

I've stopped using MSN to chat with friends; my main communication channel is now IRC, for which I use the HexChat client. I want to spend at most 1h/day on communication though.

I'm not too sure why Free Alarm Clock is part of the top 10. I believe this has to do with the note at the beginning: ManicTime considers a program active if there was some activity within the last 60 minutes. Since Free Alarm Clock was set to show up (and take focus) every hour, it quite possibly appeared from time to time while I was away and "sucked" the time away from whatever was running in the background.

mRemote is another application I used briefly for remote desktop. Since it doesn't support multi-monitor remoting, I've stopped using it.

Sublime Text 2 is my text editor of choice. I haven't spent a lot of time in it since August mainly because I haven't been doing any coding at all in the past few months.

Windows Explorer: some time spent searching for files on my PC!

I've played through the whole campaign of Torchlight II, which was pretty awesome! I'd be really happy to try multiplayer with some willing friends to see what the end content is like (single player end of game content was pretty funny, but playing it alone wasn't very satisfying for me).

#### Conclusion

The important part of this process, other than reviewing where this year's time went, is to set new objectives going forward. This means deciding what should be cut, reduced, increased or added. For each application I've already determined my goals/limits, so I simply need to make sure I follow them. A monthly review should be sufficient.

#### Recommendations

• Reduce reddit usage below 1h/day
• Reduce/cut time spent on remote work
• Redistribute free time to learning activities and skills improvement

05 Jan 2013

# Short SSD setup guide

1 min read (~143 words)

Back up everything. First rename; remove only once you've confirmed everything works. Better safe than sorry.

REM Set the new root location
REM Ex. NEW_ROOT=D:
REM     NEW_ROOT=D:\Data
REM     NEW_ROOT=D:\MyStuff\MyComputer
set NEW_ROOT=D:

REM (Make sure that C:\Users doesn't exist anymore)
move C:\Users C:\Users.Backup

REM DO NOT DO THE FOLLOWING
REM (Rename C:\ProgramData before creating the link)
move C:\ProgramData C:\ProgramData.Backup

REM Update the TEMP folder at system level
setx /m TEMP %NEW_ROOT%\Windows\Temp
setx /m TMP %NEW_ROOT%\Windows\Temp

move C:\Windows\Temp C:\Windows\Temp.Backup
REM END DO NOT DO

05 Aug 2012

# PHPUnit - No coverage report

2 min read (~387 words)

I was trying to get PHPUnit to produce a coverage report for a project I had not worked on in a long time. I had received a GitHub pull request and wanted to see the project's coverage, to check whether the submitter had done a good job of covering his code, but I couldn't get PHPUnit to generate a report containing any data. All I would get was a few pages of empty directory listings, which was useless.

I had code coverage working on another project in the same environment, so I was pretty sure my problem had to do either with how I had set up PHPUnit for this particular project, or with something else interfering with the report generation.

I tried a couple of things, starting by calling phpunit from the command line using different arguments:

--coverage-html report test\symbol_test.php

Would generate some report with data in it, good!

-c test\phpunit.xml (logging set in phpunit.xml)

Would generate an empty report, not good...

--coverage-html report -c test\phpunit.xml

Would generate an empty report, not good...

So at that point I knew PHPUnit itself was working correctly and that something was definitely wrong with my phpunit.xml configuration file. I went back to the phpunit.de manual, specifically the configuration page, and tried to figure out my problem.

For code coverage to be included in your report, you have to add a filter, be it a blacklist or a whitelist; either way, you must have a filter.

So I quickly added a filter such as

<filter>
    <whitelist>
        <directory>../</directory>
    </whitelist>
</filter>

which whitelists everything in the project (my project root is one level above test). I ran phpunit in the test folder and finally got a report with data!
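For reference, the kind of phpunit.xml this leads to looks roughly like the following. This is a sketch in the PHPUnit 3.x configuration style of the time; the report target and the whitelist path reflect my own layout, so adjust both to yours.

```xml
<phpunit>
    <logging>
        <!-- equivalent of passing --coverage-html report on the command line -->
        <log type="coverage-html" target="report"/>
    </logging>
    <filter>
        <whitelist>
            <!-- project root, one level above the test folder -->
            <directory>../</directory>
        </whitelist>
    </filter>
</phpunit>
```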