Thursday, February 17, 2011

Scaling up image stitching

In summary: the Python program I wrote can be found here (it works, but is a work in progress); you'll need the entire repo though:
https://github.com/JohnDMcMaster/pr0ntools/tree/master/stitch

Now that image capture is getting reasonably automated, stitching is the next bottleneck for mass-scale IC -> netlist conversion. The Visual6502 team is working on scaling up their image -> netlist conversion. I recently got in contact with them and am hoping to get more involved. In the meantime, I suppose I'm enough of a turbo nerd to enjoy just looking over the layouts.

Knowing that Visual6502 had the best images, I managed to convince Christian Sattler to give me his stitch code and release it under an open source license; I somewhat arbitrarily called it csstitch. You can now find it here along with some of my patches. Unfortunately, I quickly realized that the high-quality imagery from the confocal microscope had simplified a lot of the stitching. For example, no photometric optimization was being done, and it was based on autopano-sift-c (SIFT based), from which I've always gotten far inferior results compared to autopano.kolor.com (also SIFT based, which I call autopanoaj since those are the author's initials and "autopano" is too vague). From what I can tell, autopanoaj's secret may be that it has a very good outlier detection algorithm: if you turn it off, it produces many very poor control points (features). I've also been playing around with panomatic (SURF based). My general feeling has been that its quality is below autopano-sift-c's, but I haven't had enough time yet to give it a fair trial.

With this experience and some ideas from csstitch, I dabbled at making my own higher-performance stitching app. With the CNC producing very accurate XY coordinates, it seemed I could heavily optimize the control point finding process. Unfortunately, there turned out to be a bunch of gotchas along the way: some due to oddities of the .pto format, and some due to the fact that I run autopanoaj under WINE (yuck...), since I don't want to run Windows and the Linux version is out of date.

The first step is to arrange the images into a rectangle. Since the Python Imaging Library (PIL) and .pto like the origin at the upper left, this seemed the natural coordinate system. At first I tried lower left since that's what I was taught in math class, but I quickly realized this was a bad idea and converted the code to the upper-left coordinate system. I added a series of flip options so that as long as you start in some reasonable grid layout, you can flip it into the upper-left-corner convention. I also pre-process the images with something like "find . -name '*.jpg' -exec convert -strip {} {} ';'" to get rid of accelerometer data and other metadata that I found over-smart programs use to mess things up. For example, gthumb will flip images based on it, which made me arrange the images wrong. Anyway, start by getting them into some intuitive grid and then flip them as mentioned earlier:

I had a picture demonstrating the flips... but don't know where it is. In any case, the pictures above are already in the correct order, but are not named correctly for the column/row convention. I might allow parsing rows first to make the above arrangement possible; if you add a transpose, the image matrix comes out arranged correctly. A sketch of these index transforms follows below.
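Here's a minimal sketch of those transforms (my own illustration, not the actual pr0ntools code), assuming the tiles are indexed by column/row:

def normalize_grid(grid, flip_h=False, flip_v=False, transpose=False):
    """grid is a list of rows, each row a list of image file names."""
    if transpose:
        # Swap rows and columns, e.g. if the capture order was row-major
        grid = [list(col) for col in zip(*grid)]
    if flip_v:
        # Mirror top<->bottom so row 0 ends up at the top (upper left origin)
        grid = grid[::-1]
    if flip_h:
        # Mirror left<->right so column 0 ends up at the left
        grid = [row[::-1] for row in grid]
    return grid

# Example: a 3-wide, 2-tall capture that was shot bottom row first
rows = [['c0_r1.jpg', 'c1_r1.jpg', 'c2_r1.jpg'],
        ['c0_r0.jpg', 'c1_r0.jpg', 'c2_r0.jpg']]
print(normalize_grid(rows, flip_v=True))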

Next, it generates all of the adjacent image pairings (as a generator). The images are cut down so that stitching only runs on the common overlap area. This cuts processing time considerably and reduces false positives by limiting where matches can be placed. However, it adds some complexity to merging the project files, discussed later. A rough sketch of the pairing and cropping follows the example pairs below. Image pairs look something like this:




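To make the pairing and cropping concrete, a minimal sketch (again my own illustration, not the pr0ntools implementation), assuming a roughly known overlap fraction from the CNC coordinates and equal-sized tiles:

from PIL import Image

def adjacent_pairs(cols, rows):
    """Yield ((col, row), (col', row')) for each right and down neighbor."""
    for r in range(rows):
        for c in range(cols):
            if c + 1 < cols:
                yield (c, r), (c + 1, r)   # horizontal neighbor
            if r + 1 < rows:
                yield (c, r), (c, r + 1)   # vertical neighbor

def crop_overlap(fn_a, fn_b, horizontal, overlap=0.3):
    """Return crops of the two tiles covering only the shared border strip.
    Assumes fn_a is the left (or top) image of the pair."""
    a, b = Image.open(fn_a), Image.open(fn_b)
    w, h = a.size
    if horizontal:
        return (a.crop((int(w * (1 - overlap)), 0, w, h)),
                b.crop((0, 0, int(w * overlap), h)))
    else:
        return (a.crop((0, int(h * (1 - overlap)), w, h)),
                b.crop((0, 0, w, int(h * overlap))))

Any control points found on these crops then need to be translated back into full-image coordinates, which I suspect is part of the project-merging complexity mentioned above.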
I originally thought a lot of the distortion you see was due to camera/lens alignment or something similar. I eventually realized it was due to the non-uniformity of my light source. It has a diffuser filter wheel, which seems to have helped a lot. I also moved it more off center, which decreased intensity but made the light more even. In any case, it should be obvious from the above images that photometric optimization is a must for my images.

Next, running autopanoaj under Linux required some magic. First, it doesn't behave well with a number of file-related options, possibly due to WINE imperfections. The only way to make it work reliably is to let it generate its own project file(s) by running it without any file options in the directory containing the images, which is also where you want the project file(s). This requires post-processing to convert the WINE file paths in the image names to Linux file paths.
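The path fix-up is simple enough; here's a sketch of the idea. The drive-letter mapping (WINE's usual Z: for the root filesystem) and the n"..." image filename field are assumptions on my part:

import re

WINE_PATH = re.compile(r'n"[A-Za-z]:([^"]*)"')

def fix_wine_paths(pto_in, pto_out):
    with open(pto_in) as f_in, open(pto_out, 'w') as f_out:
        for line in f_in:
            # Image lines carry the filename, e.g. n"Z:\home\user\c0_r0.jpg"
            line = WINE_PATH.sub(
                lambda m: 'n"%s"' % m.group(1).replace('\\', '/'), line)
            f_out.write(line)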

After that, the projects are combined with pto_merge. While autopanoaj produces fairly lean projects, pto_merge seems to shove a bunch of junk in. This was creating some issues, so I decided to filter a lot of it out.
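The filtering amounts to whitelisting the line types I actually need. A sketch of the mechanism; which prefixes are safe to keep or drop is a judgment call, so don't treat this as definitive:

KEEP = ('p', 'i', 'c', 'v', 'm')

def filter_pto(pto_in, pto_out):
    with open(pto_in) as f_in, open(pto_out, 'w') as f_out:
        for line in f_in:
            tag = line.split(None, 1)[0] if line.strip() else ''
            # Keep panorama, image, control point, variable, and misc lines
            if tag in KEEP or line.startswith('#'):
                f_out.write(line)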

Finally, I do some post-processing to get things closer to the final output. This includes changing the mapping to rectilinear and restricting the optimization variables to only d (x) and e (y). Currently, stitching has to be finished in the GUI; this should be fixable if I can eliminate more control point gaps through image processing.
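Roughly, the post-processing looks like the sketch below. The f0 projection code and the d/e optimizer variable names follow my reading of the PanoTools script format, so double-check them against the .pto documentation:

import re

def post_process(pto_in, pto_out, n_images):
    with open(pto_in) as f:
        lines = [l for l in f if not l.startswith('v')]  # drop old v-lines
    out = []
    for line in lines:
        if line.startswith('p '):
            line = re.sub(r' f\d+', ' f0', line, count=1)  # rectilinear output
        out.append(line)
    # Optimize only x (d) and y (e); leave image 0 fixed as the anchor
    out.extend('v d%d e%d\n' % (i, i) for i in range(1, n_images))
    out.append('v\n')
    with open(pto_out, 'w') as f:
        f.writelines(out)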

.pto documentation is surprisingly scarce in the panotools world; I don't know if I'm just not looking in the right place. I eventually realized that the suite pto_merge is part of has some good documentation, and was quite happy to finally find a good description of the .pto format. It would be something good to add to the panotools wiki. I just requested mailing list membership and might bounce some of my ideas off of them.

One of the issues is that some of my images are of such poor quality that RANSAC / min result thresholding rejects the control points entirely. This is usually due to a blurry image. Example troublesome pair:



If I remove RANSAC, I can get it to generate a very poor match:


After a suggestion from someone, I played around with Kolourpaint image transforms and observed that softening the images (sort of a blur) makes the features uniform in both, so accurate control points can be generated. Although the images look somewhat different, the control points still land in the same locations on the original images. Example transformed images:



Wow! What an improvement. The new set was run with RANSAC enabled since it generated so much better data. I have yet to figure out how to implement an equivalent transform in Python; I did some preliminary tests with ImageTransform.*, but haven't tried very hard yet.
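My current guess is that a plain Gaussian blur via PIL/Pillow's ImageFilter roughly approximates the Kolourpaint soften, but that's an assumption and the radius would need tuning:

from PIL import Image, ImageFilter

def soften(fn_in, fn_out, radius=2):
    # Blur the tile before feeding it to the control point finder;
    # control points still map back to the same pixel locations.
    img = Image.open(fn_in)
    img.filter(ImageFilter.GaussianBlur(radius)).save(fn_out)

# e.g. soften('c0_r0.jpg', 'c0_r0_soft.jpg')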

I've also been working on fully decoding a CD4011. Hopefully I'll have a write-up of that soon. In some ways, large CMOS designs are easier than small jobs because standard logic cells and other components have to be more regular for large designs to scale. That aside, the schematic is rather simple and other factors make the number of permutations fairly small. The main issue has actually been how to record the circuit nicely. My main choices so far seem to be the visual6502 Python GUI and GIMP. I've found GIMP not so user friendly, although I hear it's not so bad once you get used to it. I'm not sure if any pictures of the layout editor have been published, and I don't remember any restrictions against publishing them, so for those of you who have never seen the layout editor:


I now work for an aerospace company, Skybox Imaging, where I'm starting to learn about rad-hard parts. I don't think I can get any from work, but if someone happened to have something, it might be fun to image and compare to other parts (though I may not be able to publish it if there are ITAR issues). Finally, someone suggested I submit something to Kickstarter, so I figured why not. Better toys, better research.

EDIT:
Kickstarter rejected me; it was worth a try. Looking back over tools, Degate is really what I should be using. I played with it a little, and if I really want simulatable results, I'll look into writing a plugin for it to export to the visual6502 jsim format. It's too bad I lost the Fairchild CD4011 I had; it looked a lot more like the textbook CMOS I had seen. I only got initial images from it before I lost it. Since then, I've gotten better (now plastic) tweezers that tend not to slip and launch things.

Thursday, January 27, 2011

Metallurgical microscope CNC

I now own what is possibly the world's first combination CNC milling machine and metallurgical microscope.
I've realized, among other things, that I just like looking at dies to admire the work put into them. Unfortunately, it's a lot of work to take the many thousands of pictures required to get a good level of detail on even something like a 386. Plus, if you want the whole circuit, you need to repeat this for many layers.
Fortunately, I have some background in robotics, and since I'm planning on getting a better microscope in the next few months, I don't feel bad being a little more aggressive with my current setup. A Unitron N, the model I have, is supposed to look something like this (image from http://microscopesonline.info/):
The z-axis gear got partially stripped at one point when I was trying to fit a shim, as mine was missing. One thing in particular I hated was the upside-down sample mounting: I usually used Post-its or similar to hold the dies to a drilled-out petri dish. Which brings up the next annoyance: it's a hassle to even get something mounted onto the stage at all.
Not too scared, then, to be a little aggressive with my half-loved contraption, I got this after a few modifications:
See a crude video of it working here.
Some time ago I ditched the Polaroid setup since I wasn't going to use it in any form. Then I mounted the microscope upside down on some t-slot aluminium to make viewing samples much more convenient. Next, I wanted CNC control, and I didn't really like the XYZ setup anyway, so I replaced the XY with my Sherline 2000 CNC XY stage. It turns out the CNC head can also still fit, but I didn't have it on during early testing.
The Z axis was a bit trickier. An earlier picture that shows the basic idea:
You can also see I had to tape the eyepieces in so they wouldn't fall out. At first I tried to figure something out with my rotary table, since it was the only other heavy-duty CNC equipment I (thought I) had. I also had a Z stage for optical work, but the thumb screw was very hard to turn and adapting a servo would be difficult; the dimensions were also awkward for actuating it with the rotary table. I eventually realized I had a CNC micrometer from half of a UV-VIS spectrometer I found and scrapped at RPI. The brackets were close enough to easily adapt to the XY t-slot with an L-bracket. The sample tray base is a largish L-bracket to which I've attached several different holders to experiment with. Ultimately I'll probably replace it with a kinematic mirror mount so I can correct tilt errors easily. An early test was to instead use a largish petri dish for the same purpose, but I found that Z-axis movement tended to move things around too much; I should still try to couple it more tightly to the main axis to reduce vibration, but it doesn't seem suitable, unfortunately. Finally, the original setup depended on gravity to remain stable. To compensate, I have it tightened with a rubber band:
The rubber band goes around the brass part, which was supposed to be pressed against the shaft by the weight of the equipment mounted to it. Since it's been turned on its side, this is no longer true. At some point I might see if I can make a more proper spring-loaded replacement.
One issue that came up was that although you can still view through the eyepiece, it's pretty awkward: with the camera over one eyepiece and not wanting to re-adjust it, it becomes difficult. So I wanted to get the view onto a computer screen, which is probably nicer on the eyes anyway. A 1/8" audio-style jack breaks out composite video, which I convert to an RCA-type plug so it can go into my composite -> VGA converter box. The VGA then goes to an LCD display affixed to the t-slot. The second display behind the first, possibly not obvious in the above image, was arbitrarily placed there to get a display up on a nearby media server and to get the screen off of the floor.
The camera is mounted on t-slot aluminium as well. My Canon SD630 doesn't have a remote capture cord port and its USB only supports PTP, so there is no built-in way to do remote capture. So I removed the top cover and soldered some wires onto the capture button. There are two contacts: focus and snap. Shorting snap by itself is not enough to take a picture; focus must be depressed first. A DB25 breakout box runs to some optoisolators to short the signal. I figured out the correct polarity by using a volt meter on the leads coming from the camera button.
The electronics hardware is very simple: the DB25 goes to a breakout board and then continues on to the stock Sherline driver box. I made a simple adapter to use the Vexta motor on the A axis with the Sherline box. The camera driver circuitry is very minimal:
The unused IC there is a CD4050 buffer I was going to use on the parallel port. I got lazy and didn't wire it up as the parallel port was already putting out near 5V.
Finally, there are several pieces to the software. At the core, I'm running EMC2. I set the step speeds and acceleration low to try to keep the sample from vibrating. The camera is actuated with M7/M8 (coolant mist/flood) and then reset with M9 (coolant off). I use dwell instructions to give the camera enough time to take pictures; I'm still working out how long the dwell needs to be.
The second part of the setup is the software that generates the g-code. I wrote a Python program that you can find in my pr0ntools GitHub repo. It's very crude currently, but may be sufficient. It assumes you are scanning a rectangle: one corner is assumed to be the origin and the other is supplied on the command line, and to define a plane I assume the most level plane you could form from those points. I currently always start scan rows from the same side on the theory that it might reduce backlash issues, but I'm not sure if it matters. A simplified sketch of the generated g-code is below.
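This is a heavily simplified illustration of the kind of g-code it spits out (not the actual pr0ntools output; the feed rate, dwell, and step size are placeholders, and the tilted-plane Z correction is left out):

def scan_gcode(x_end, y_end, step, dwell_s=2.0, feed=2.0):
    # Raster over the rectangle, firing the camera at each point via M8
    # (flood coolant output, wired to the shutter optoisolator) plus a dwell,
    # then M9 to release.  Rows are always entered from the same side.
    lines = ['G90', 'F%.2f' % feed]            # absolute moves, slow feed
    y = 0.0
    while y <= y_end + 1e-9:
        lines.append('G0 X0.0000 Y%.4f' % y)   # approach each row from X=0
        x = 0.0
        while x <= x_end + 1e-9:
            lines.append('G1 X%.4f Y%.4f' % (x, y))
            lines.append('M8')                 # trigger the camera
            lines.append('G4 P%.1f' % dwell_s) # give it time to snap
            lines.append('M9')                 # reset the trigger
            x += step
        y += step
    lines.append('M2')                         # end of program
    return '\n'.join(lines)

print(scan_gcode(x_end=1.0, y_end=0.5, step=0.1))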
The test wafer I used looks like this:

An interesting piece in its own right; from what I can tell it came from an Intel "Journey Inside the Computer" educational kit. Of course, I didn't scan the entire wafer, just one chip. While the plastic distorts the image, it does make a good test as it's easy to level and rotate.
Between my lenses being kinda dirty (I can probably fix this or upgrade to modern optics; I think it uses DIN components), not washing the wafer holder, and the plastic layer, the first pictures still came out relatively nice. While it may not produce quality images like the visual6502 or Flylogic teams do, it should serve to efficiently create relatively high-resolution shots to my heart's content. When I get a better microscope, I might also look into CNC retrofitting it, but more likely I'll focus on improving this one, as better microscopes are currently beyond my budget for a high-risk project.

Wednesday, December 29, 2010

Berlin and CCC/Berlinsides

Just wanted to say I'll be in Berlin for a few days if anybody wants to say hi.
The lab is moving and I'm on the move, but I should finally get settled with $$$ flowing in over the next few months; stand by for cool stuff.

Tuesday, November 30, 2010

Sulphuric acid decapsulation

Something I've been meaning to try for some time. Somewhat arbitrarily, I decided to go analog; the victims were a 741 op amp and a 555 timer. The torture:
The victim (I snapped the pins off since they are easy to remove and had more of an impact on the nitric reactions):
Initial setup:
I didn't take a picture of this, but the solution started to turn brown and diffuse out around the IC before too long. Began to turn darker:
And eventually black:
The acid behaved differently in the 555 flask (misty with no creeping along the sides for the 555, vs. creeping and no mist for the 741):
After draining and some initial acetone rinse:
The die can be seen in the leftmost object. I've been told that sulphuric is useful for live decapsulation, and it certainly shows here: much of the "wiring" was preserved despite prolonged exposure to acid, whereas nitric would have obliterated it. Not as visible, but all of the bond wires were also preserved.
Since they were both analog ICs made by ST, it seems less likely that they were packaged in different epoxies. Both used fresh acid, so the difference was probably due to some contamination in the flasks.
I'll try to update with some IC pictures. Nitric tends to leave a lot of residue; this, on the other hand, left overall clean dies, although one of them had sort of a grainy appearance, maybe from certain residues? Apparently the 555 didn't have a passivation layer while the 741 did, which resulted in scratches on the 555 when I was careless during plucking and didn't realize it wasn't protected.

In summary, this is what I thought:
Advantages
-Fewer fumes than nitric acid; MIGHT be safer with less equipment / ventilation. With the cover on my glassware and the top being somewhat cold from the ambient temperature, it seemed to reflux the acid and I didn't really notice the fumes. Contrast this with nitric, where fumes are an inherent problem from the nitrate decomposition.
-Inexpensive
-Readily available materials? Battery acid and drain cleaner are readily available. Battery acid tends to be purer and would likely need to be distilled first, while drain cleaner (ex: Bull Dozer) is much stronger but has contaminants. In any case, it's generally not a controlled substance and one should be able to order it without too much trouble.
Disadvantages
-Higher working temperature. Might literally make your hand fall off if you spilled it on yourself. When I was younger a single drop of cold concentrated sulphuric landed on my hand and caused a severe burn, of which I'm reminded to this day by a scar. I can't imagine what a broken boiling beaker could do.
-Between the solution turning black and the lack of bubbles, there's no clear indication of when it's "done." Combined with the longer cool-down time of the acid and glassware, this can make it inefficient for small batches.
-Grainy appearance on dies? Need to look more into where that came from.
Overall, probably a good compromise for those who want to try some of this stuff.

With this in mind, one good application might be to use it as a wash after nitric. Since I've found issues with particulate residues after nitric, a brief sulphuric bath might be able to clear them off. I think sulphuric works at lower (i.e. room) temperatures, albeit much slower, so it might not even require heating. As a starter, I'll probably try soaking a fully encapsulated IC overnight and see how it goes.

Monday, November 8, 2010

Back to Troy, NY

After spending the summer in Cambridge, MA and a few weeks back in the SF Bay Area, I'm back in Troy, NY. What makes Troy special? I'll tell you...
Luxurious homes
Expensive cars
Booming industry
And fine art
Okay, so it's not quite as bad as I make it look, but most of these were taken pretty close to my apartment. To be fair, they've been knocking down a lot of the old buildings, and graffiti is pretty rare, except when a construction wall went up at RPI and people went nuts. I'll omit those pictures: even if the first pictures don't get me hate mail, I get the feeling RPI might give me a "strong suggestion" to take down the latter.
Now that the small talk is out of the way, on to business. Although I haven't been posting anything, a lot has been happening. First, the microscope I previously mentioned never came, but eBay refunded me. However, my room-mate bought a metallurgical microscope with a USB camera, so I'm better off than ever. Being off campus now, I also have fewer restrictions and don't have to deal with RA BS and such. One perk of my apartment is that I got some lab space in an area that's being remodelled. It's going to go away in January, but I'm hopefully moving out then anyway, so that shouldn't really affect me. The upshot is that I'm finally getting a chance to do all of the stuff I wanted to before, and I actually have some time and space to try things out.
I've imaged a bunch more ICs. In particular, I have images of discrete transistors, a fully delayered 7404 hex inverter, and other ICs.
3906 top metal
Old TMS320 logo section
On that note, a die image archive was started at http://intruded.net:8080/wiki/. Since I like wikis, I got myself an account, and you should expect any die images I publicly release to appear there. I posted a few from a while back, but haven't gone on a rampage yet. One of the things they are working on is a "Google Maps" style viewer for larger ICs. A crude test page is at http://intruded.net:8080/map/ (you'll have to zoom to the correct level).
Map view test for large ICs
Regarding http://siliconpr0n.wikispaces.com/, I recently got permission from Sergei P. Skorobogatov to include images from his Semi Invasive Attacks paper on the Wiki as long as they are credited to him. So, along with the other material I've been accumulating from my own research, expect some rapid expansion on the Wiki in the near future.
After delayering a few 7400 series ICs, I've realized I had in fact reached the transistor layer before, but just didn't understand what I was looking at. I had probably been confused by all of the MOS pictures I had seen. In any case, I tried a 74163, but found it was too complex to start with; I could only recognize a handful of components. A few days ago I delayered a 7404, which should provide a much cleaner reference circuit since it's small and more or less split into 6 regular units. Unfortunately, I let it sit for a while without agitation, so it crystallized a bit, but it should be fine for my purposes.
7404 transistors
A quick overview of the techniques I currently use and why. Most ICs are in epoxy; I boil them in 70% nitric until the epoxy is removed. Lacking an ultrasonic cleaner, I wash them in room-temperature 3% HF for about a minute to clean the surface. This takes a thin layer off the top, which removes most debris. Then, depending on how patient I'm feeling, I use either room-temperature or near-boiling 3% HF to delayer the IC. If I want to keep an IC suitable for live analysis (mostly my roommate has been doing this), a Dremel "drill press" with a small endmill is used to make a cavity above the die; we use a rough estimate, usually stopping slightly above the pins, to guess how far down to go. The package is pre-heated to 300F, then a drop of acid is put on top, allowed to etch, and washed with acetone before it dries out. Heating the acid doesn't seem to make a difference, as the heat it carries is negligible (plus it cools off in transfer) compared to that of many IC packages. I also played around briefly with another low-cost method that is more automatic but less selective; I'll try to post something on that soon.
Finally, I'm interviewing with various companies and looking for a job, so if you think you might be interested in me, feel free to send me an e-mail at JohnDMcMaster gmail.com.

Sunday, July 25, 2010

End of summer plans

As my summer internship comes to a close, I never managed to get access to microscopes out here. If I had bugged a large number of people I might have, but there was so much else to do in Boston that it's just as well.
That doesn't mean I haven't been preparing, though. For starters, I've accumulated a small hoard of Intel CPUs that I hope to image in the near future. The microscope I bought a while back wasn't as good as I thought it was, although it should still be fine for the short-term work coming up. I'm very grateful to be able to borrow the microscope at RPI, but for a number of reasons it's good to have my own. After searching around for a large portion of the summer, a decent quality Olympus microscope finally showed up on eBay:
What made this affordable? Two things: first, it was a first-time eBay seller; second, it was coming from Thailand. So there's a bit of risk involved. Usually people don't scam on scientific items, and if they do I can probably do a chargeback or possibly even a PayPal dispute; I'm not sure about their policies with other countries. If you scream loud enough, someone will usually do something if bad things happen.
The other part is that I looked briefly into what it would take for me to order RFNA. The companies I saw required you to be associated with a business. I'm not sure if I could play any tricks with my school, but from the business standpoint it would be very expensive: yearly business registration fees seem to be pretty high in CA (something like $800 a year, if I recall), which is not justified for this project.
Next, I've acquired some ground-glass glassware, which has been on my TODO list since early high school. If I need to produce any chemicals for analysis now, it should be considerably easier and the results of higher quality.
I fly home to CA in two weeks. While there, my plan is to image a number of 7400 series logic chips to form a practical foundation; this should also be useful for others to study from. I could have bought some from Digikey etc., but I suspect those probably use newer process technologies that would be harder to analyse. So I heat-gunned some off of old circuit boards and will be using those. Assuming those weeks go well and my microscope arrives at RPI, I will start imaging the Intel chips as time permits during the school year.
Finally, I may play a small part in an upcoming commercial project. While I won't be able to disclose the details, part of it would include them buying all of the acid and other supplies I need to decap the chips, giving me an idea of how working with commercial-grade equipment compares. My role will be focused on the decapsulation and possibly aiding in the analysis.

Saturday, May 15, 2010

Biological, inverted microscope layer image comparison

Biological, top metal (220X):
Interconnect doesn't image well on the biological microscope; it comes out all black. Sample for comparison (not the same area):
If you are stuck with a biological microscope, set the intensity as high as you can and you should be able to make things out with your eye, but cameras might have more difficulty.
Inverted metallurgical (inverted), top metal (440X):
Inverted metallurgical (inverted), interconnect (440X):
Although all three are from the same chip type, only the two inverted images are from the exact same chip. NOTE: I have done a vertical flip on the inverted images to correct them to what the actual object is like (as in the biological image).
For those wondering where the magnification levels come from, it is a combination of the eyepiece, objective, and camera magnification. For the biological microscope, the image was shot at 10X eyepiece * 10X objective * 2.2X camera = 220X. For the inverted metallurgical microscope, the images were shot at 10X eyepiece * 20X objective * 2.2X camera = 440X. Thus, I (poorly) manually stitched two images together from each to get closer to the size of the biological image.
Alas, despite my best efforts, I still can't see transistors. Today I tried some extended hot baths in HF on the chip above, and even tried adding some 30% H2O2, which dramatically increases the action of the HF. I'm letting it sit overnight (or maybe till Wednesday, after finals) to see if it ate through. Maybe these chips are just resistant? I'll have to go back to sanding to get another sample, though, since I haven't gotten any other technique to work yet on epoxied ICs.