Yup, that's the simplest solution, and I won't knock it for its simplicity, but I'm after a more solid, pixel-perfect solution. If only I'd known the torture it would bring me while searching!
@jerejigga - there is a way to avoid it (I mean, apart from thomas6's suggestion above).
Although the problem also appears slightly when an image is shrunk, it is most noticeable when you zoom a tile image using "linear" filtering, so I'll concentrate on explaining why that happens.
Linear filtering gives a nice blend between pixel colours. Say you have two pixels next to each other in the source image, one white and one black. When you show that image zoomed in, you'll actually see a smooth fade from white to black. The GPU is linearly blending between the pixels so they don't look all pixelated and silly up close (unless that is what you want!).
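To make that concrete, here's a tiny sketch of the blend arithmetic (in plain Python rather than Corona Lua, purely to show the numbers):

```python
# Linear filtering blends neighbouring pixels by their fractional distance.
# Here a white pixel (255) sits next to a black pixel (0); sampling at
# points between their centres gives a smooth fade rather than a hard edge.

def lerp(a, b, t):
    """Linear interpolation between values a and b, with t in [0, 1]."""
    return a * (1.0 - t) + b * t

white, black = 255, 0

# Sample 5 points across the boundary between the two pixel centres:
samples = [round(lerp(white, black, t)) for t in (0.0, 0.25, 0.5, 0.75, 1.0)]
print(samples)  # [255, 191, 128, 64, 0] - a smooth fade from white to black
```

That fade is exactly what you want inside a tile, and exactly what you don't want at its edges.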
Now, with that in mind, what happens when you try to draw the edge of a tile sprite? Well, it depends on how you set up your source tile set image.
If each source tile touches the tiles next to it, then when you draw your view, each sprite will have a border where the edge pixels of the tile you want blend into those of the tile next to it. Often not good.
The next possibility (and what I recommend) is that individual tile graphics don't touch each other in the source tile set image. Ideally you leave a gap of 2 pixels or more between them all, both horizontally and vertically. However, this now means that when you draw a tile sprite, the edge pixels blend from the actual graphic into pure transparency. This is where the gaps come from.
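For anyone setting up an atlas this way, the frame offsets just need to account for the gap. A sketch of the arithmetic (names are my own for illustration, not any Corona API):

```python
# Source-rect arithmetic for an atlas with gaps between tiles.
# tile_w/tile_h are the tile dimensions, spacing is the gap (2px or more
# recommended), margin is any border around the whole sheet.

def tile_source_rect(col, row, tile_w, tile_h, spacing=2, margin=0):
    """Top-left x, y plus size of the tile at (col, row) in a padded tile set."""
    x = margin + col * (tile_w + spacing)
    y = margin + row * (tile_h + spacing)
    return x, y, tile_w, tile_h

print(tile_source_rect(0, 0, 16, 16))  # (0, 0, 16, 16)
print(tile_source_rect(2, 1, 16, 16))  # (36, 18, 16, 16)
```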
So, the proper solution (and this is a known thing when working on this type of game) is to do your tile set graphics, then expand the edge pixels of each tile graphic by 1 pixel - i.e. for each tile you duplicate its left edge one pixel to its left, its right edge one pixel to its right, and likewise for the top and bottom edges. This means edge pixels will be blended with a copy of themselves, so they'll look 99% correct - hurrah!
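The extrusion itself is simple in principle. Here's a sketch of it on one tile, using a list-of-lists "image" in plain Python just to show the idea (a real tool would of course work on the atlas bitmap):

```python
# 1-pixel edge extrusion on a single tile. The tile grows from w x h to
# (w+2) x (h+2) with its edge rows/columns duplicated outward, so linear
# filtering blends edge pixels with copies of themselves instead of with
# transparency or a neighbouring tile.

def extrude(tile):
    # First pass: duplicate the left and right edge pixel on every row...
    rows = [[row[0]] + row + [row[-1]] for row in tile]
    # ...second pass: duplicate the (already widened) top and bottom rows.
    # Running the vertical pass on the horizontally expanded rows also
    # fills in the four corner pixels.
    return [rows[0]] + rows + [rows[-1]]

tile = [
    [1, 2],
    [3, 4],
]
for row in extrude(tile):
    print(row)
# [1, 1, 2, 2]
# [1, 1, 2, 2]
# [3, 3, 4, 4]
# [3, 3, 4, 4]
```

Note the horizontal-then-vertical ordering: the second pass working on the output of the first is what gets the corners right.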
However, this is a pain if you are still editing the tile set graphics. It is a tedious and frustrating task at the best of times. If you don't do it though, your graphics in-game while testing look rubbish.
So, what I've been struggling to do, with some degree of success, is to use the new canvas object to automatically expand the borders.
Essentially what I do is load the tile set graphic into a canvas. Then I make the canvas an image sheet, creating sprite frames from the edge pixels of each tile that I need to expand. Then I :draw() a lot of sprites into the self-same canvas, using these frames to offset the tile edges, finishing with an :invalidate(). Finally I create another image sheet from the canvas, which is the normal one I'd use for my tile sets as if I'd done none of this expansion stuff.
So what are my problems? Firstly you *must* do all this edge expansion stuff with the texture min and mag filters set to "nearest" or it just goes wrong. That alone isn't bad, but it means that any use of the tile set canvas later on will use these filters, which is likely *not* what you want to happen.
Secondly, because you actually want the corner pixels expanded too, I do my edge expansion in two passes: first horizontal, then vertical. The vertical pass uses the already horizontally expanded source, so the corners get filled in as well. The problem is that, for some reason, I can't just update and invalidate() the canvas horizontally one frame after initialising it, then do the same for the vertical adjustments in the frame after that. It *should* work, and indeed in the simulator it does. But on my device I need to leave a delay of roughly 700 milliseconds between updates. This isn't a huge deal, but it means something is going on that I'm not aware of, and it means you can't start showing the level until approximately 1.5 seconds after you set it up. Not something I'd want in a publicly released library :/