All you need are barycentric coordinates, since you are dealing with triangles. Assign each vertex of the triangle its own identity (e.g. <1,0,0>, <0,1,0>, <0,0,1>), then use the hardware's built-in interpolation between the vertex and fragment stages to figure out the fragment's relative distance from each of the vertices in your fragment shader.
You can think of each vertex's barycentric coordinate as its distance from the opposite edge. In the diagram below, vertex P0's opposite edge is e1, and its distance is represented by h1; its barycentric coordinate is <0.0, h1, 0.0>. GPUs effectively use this coordinate space internally to interpolate vertex attributes across a triangle as fragments are generated during rasterization; it makes quick work of weighting per-vertex properties based on a fragment's location within the triangle.
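To make that weighting concrete, here is what the interpolation effectively computes for any per-vertex attribute (a conceptual sketch only; the function and parameter names are made up for illustration and are not part of any API):

```glsl
// Conceptual sketch: this is effectively what the rasterizer does for you.
// bary holds the fragment's normalized barycentric weights (they sum to 1.0);
// a0, a1, a2 are the attribute's values at the triangle's three vertices.
vec3 interpolateAttribute(vec3 bary, vec3 a0, vec3 a1, vec3 a2)
{
    return bary.x * a0 + bary.y * a1 + bary.z * a2;
}
```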
Below are two tutorials that explain how to do this. The technique is typically used to render a wireframe overlay, so you may have better luck searching for that. For your purposes, since this is effectively a specialization of wireframe rendering (with the addition that you want to throw out lines that do not belong to exterior polygon edges), you will need to identify edge vertices and do some additional processing.
For instance, if an edge is not an exterior edge, assign its two vertices barycentric coordinates of something like <1,100,0> and <0,100,1>, and that interior edge will be ignored (assuming it is the edge opposite the vertex designated <0,1,0>, as seen in the diagram below). The idea is that you never want a point along this edge to interpolate anywhere near 0.0 (or whatever threshold you use for shading a fragment as part of the border); making the edge extremely distant from the center of the triangle in the direction of the opposite vertex accomplishes this.
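To make this concrete, here is a minimal sketch of the two shaders involved (GLSL ES 1.00). The attribute/varying names (a_barycentric, v_barycentric, u_mvp) and the 0.02 edge threshold are arbitrary choices for illustration, not anything mandated by the technique:

```glsl
// Vertex shader: pass the per-vertex barycentric identity through to the rasterizer.
attribute vec3 a_position;
attribute vec3 a_barycentric;  // <1,0,0>, <0,1,0> or <0,0,1> for exterior edges;
                               // e.g. <1,100,0> and <0,100,1> on the endpoints of an
                               // interior edge so it never registers as a border.
uniform   mat4 u_mvp;
varying   vec3 v_barycentric;

void main()
{
    v_barycentric = a_barycentric;
    gl_Position   = u_mvp * vec4(a_position, 1.0);
}
```

```glsl
// Fragment shader: a fragment lies on an edge when any interpolated coordinate
// approaches 0.0.
precision mediump float;
varying vec3 v_barycentric;

void main()
{
    float nearest = min(min(v_barycentric.x, v_barycentric.y), v_barycentric.z);
    if (nearest < 0.02)                            // arbitrary edge-width threshold
        gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);   // border fragment
    else
        gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);   // interior fragment
}
```

With the interior-edge trick described above, both endpoints of a hidden edge carry 100 in the slot belonging to the opposite vertex, so that component interpolates to roughly 100 along the entire edge and never drops below the threshold. Note that a fixed threshold in barycentric space gives lines whose thickness scales with the triangle; the codeflow tutorial linked below instead uses screen-space derivatives (fwidth) for constant-width lines, which on OpenGL ES 2.0 requires the OES_standard_derivatives extension.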
Without Geometry Shaders (OpenGL ES friendly):
Here's a link explaining how to do this, if you are able to modify your vertex data to hold the barycentric coordinates. It has higher storage and pre-processing requirements (in particular, sharing vertices between adjacent triangles may no longer be possible, since each triangle needs three vertices that each carry a different input barycentric coordinate; this is why geometry shaders are a desirable solution). However, it will run on far more OpenGL ES class hardware than more general solutions that require geometry shaders.
https://web.archive.org/web/20190220052115/http://codeflow.org/entries/2012/aug/02/easy-wireframe-display-with-barycentric-coordinates/
With Geometry Shaders (Not OpenGL ES friendly):
Alternatively, you can use a geometry shader to compute the barycentric coordinates for each triangle at render time, as seen in the tutorials below. Chances are you will not have access to geometry shaders in OpenGL ES, so this approach can probably be ignored.
http://strattonbrazil.blogspot.com/2011/09/single-pass-wireframe-rendering_10.html
http://strattonbrazil.blogspot.com/2011/09/single-pass-wireframe-rendering_11.html
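For reference, the core of the geometry-shader approach looks something like the sketch below (desktop GLSL 1.50; the output name g_barycentric is made up for illustration, and a real shader would also forward any other per-vertex attributes it needs):

```glsl
#version 150
// Geometry shader: assign a barycentric identity to each corner of every incoming
// triangle at render time, so no extra per-vertex data is required.
layout(triangles) in;
layout(triangle_strip, max_vertices = 3) out;

out vec3 g_barycentric;   // read as "in vec3 g_barycentric;" in the fragment shader

void main()
{
    vec3 identity[3] = vec3[3](vec3(1.0, 0.0, 0.0),
                               vec3(0.0, 1.0, 0.0),
                               vec3(0.0, 0.0, 1.0));
    for (int i = 0; i < 3; ++i) {
        gl_Position   = gl_in[i].gl_Position;
        g_barycentric = identity[i];
        EmitVertex();
    }
    EndPrimitive();
}
```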
The theoretical basis for this solution can be found here (courtesy of the Internet Archive Wayback Machine):
http://web.archive.org/web/*/http://cgg-journal.com/2008-2/06/index.html