That's just an artifact of stitching different images into "one" on the spherical body.
Another example of this sort of Google artifact: 77°39'21.64"N 24°58'29.70"E (note the distinctive edges of the separate images; just zoom in/out a few times... in Google Earth, of course):
Others have already explained very well why this is a glitch, but I thought I'd come at it from another direction.
I came across this topic on another forum and while googling the subject, found this thread.
Anyway, the problem is that people don't understand the resolution of the photo they're looking at. If you zoom in on Google Mars so that "Bio Station Alpha" is nice and big, you can see that the terrain surrounding it is very blurry and blocky. There's no fine detail whatsoever - no rocks or surface texture is visible - just a blurry mess of a background. If you examine it you can actually see the vertical and horizontal color banding that represents the actual pixels of the original image. You can see that the original pixels are actually about the same size as the north-south diameter of "Bio Station Alpha". What's happening here is that Google Earth is doing something similar to the anti-aliasing that games and graphics software do to smooth out low-resolution textures - a blurry texture looks better than a blocky texture.
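To make the mechanism concrete, here's a minimal sketch of that smoothing effect. The pixel values and the 4x4 "image" are made up for illustration (a dark background standing in for Martian desert, one row of bright pixels standing in for the glitch), and bilinear interpolation is used as a stand-in for whatever resampling Google Earth actually applies - the point is just that blowing up a hard-edged low-resolution image with any interpolating filter turns single pixels into smooth gradients that can look like photographic detail:

```python
# A low-resolution "image": one row of bright pixels (a data glitch)
# on a uniform dark background, like the white pixels behind
# "Bio Station Alpha". Values are 0-255 grayscale. All numbers here
# are invented for illustration.
def bilinear_upscale(img, factor):
    """Upscale a 2-D list of pixel values by `factor` using bilinear
    interpolation - a stand-in for the smoothing Google Earth applies
    when it displays a texture larger than its native resolution."""
    h, w = len(img), len(img[0])
    out = []
    for oy in range(h * factor):
        # Map the output pixel back to a fractional source coordinate.
        sy = oy / factor
        y0 = min(int(sy), h - 1)
        y1 = min(y0 + 1, h - 1)
        fy = sy - y0
        row = []
        for ox in range(w * factor):
            sx = ox / factor
            x0 = min(int(sx), w - 1)
            x1 = min(x0 + 1, w - 1)
            fx = sx - x0
            # Weighted average of the four surrounding source pixels.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

# 4x4 source: dark desert (60) with one row of glitched white pixels (255).
source = [[60] * 4 for _ in range(4)]
source[1] = [255] * 4

big = bilinear_upscale(source, 8)

# The single hard-edged white row now fades smoothly into the
# background over several output rows - exactly the kind of soft
# gradient people read as the curved wall of a "structure".
column = [round(big[y][0]) for y in range(len(big))]
print(column)
```

Running it, the first column climbs smoothly from 60 up to 255 and back down again: the one-pixel-tall glitch has become a soft, shaded band eight output rows tall, with no extra information added anywhere.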
If you look at the Mars desert background surrounding "Bio Station Alpha", you can see the same vertical fading amongst the blocks that represent the pixels of the original digital image. It's just more subtle because there's not much difference in color amongst the dull browns of the Martian desert. However, where there's high contrast between a feature and the surrounding terrain, that vertical color fading is much more obvious.
It's trivially easy to demonstrate this. To the east of "Bio Station Alpha" there's some sort of geographical feature with lots of objects randomly scattered across the desert. The objects are brighter than the background, so they stand out easily. At a distance you can see that they are irregularly shaped and randomly distributed. I suspect the color contrast is an artefact of either how the image was processed or the time of day the picture was taken. You can also see that east of "Bio Station Alpha" there is an obvious vertical discontinuity where two images were stitched together in Google Earth, and to the west of the discontinuity the white objects don't appear. Here's a screen grab of the area in question.
If you zoom in closer, you'll see something like the following: (note the low resolution of the pics - you can see the blockiness of the pixels even though Google is trying to blur it out)
Zoom in, take a screenshot, and load it into Photoshop or a similar graphics package. Rotate it to a jaunty angle, adjust the brightness and contrast to artificially make it look sharper - like the image of "Bio Station Alpha" that was already posted several times - and hey presto! you've got yourself a little cluster of "Mini Bio Stations"!
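The brightness/contrast step does a lot of the work here, so it's worth seeing how little it takes. This is a toy sketch, not Photoshop's actual algorithm: a standard linear contrast adjustment (scale around mid-gray, then clamp to 0-255), applied to an invented row of pixel values shaped like one of the interpolation gradients described above:

```python
# Simple brightness/contrast adjustment of grayscale pixel values:
# out = clamp(contrast * (v - 128) + 128 + brightness, 0, 255).
# A toy model of the kind of tweak described above, not Photoshop's code.
def adjust(pixels, contrast=1.0, brightness=0):
    return [
        [max(0, min(255, round(contrast * (v - 128) + 128 + brightness)))
         for v in row]
        for row in pixels
    ]

# An invented soft gradient, like the ones image upscaling creates
# from a single bright pixel next to dark terrain.
gradient = [[60, 110, 160, 210, 255]]

boosted = adjust(gradient, contrast=2.0, brightness=10)
print(boosted[0])  # -> [2, 102, 202, 255, 255]
```

Doubling the contrast pushes the midtones toward the extremes: the gentle 60-to-255 ramp becomes a near-black-to-white edge, which is exactly why a contrast-boosted screen grab of a blurry gradient suddenly looks like a crisp, artificial object.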
The problem is clearly that lots of people don't realise they are looking at an extremely low-resolution image blown up. They think those color gradients on the "cylinders" are photographic detail when it's just Google's automatic image processing blurring out the pixels. Like I said, you can even see the same effect in the brown background; it's just less obvious because of the narrower range of colours.
Yes, it's not as interesting or artificial-looking as "Bio Station Alpha", but that's because the same post-processing produces a more striking effect when the source image contains a neat row of white pixels caused by some sort of data glitch (as someone else explained very well, using three images from a series to show that only one of them had that row of white pixels in this location). My version wasn't as effective because it started from oddly shaped natural formations with a less dramatic color drop-off than neat square white pixels on brown terrain.