User-Guided Line Abstraction Using Coherence and Structure Analysis

Hui-Chi Tsai
Ya-Hsuan Lee
Computational Visual Media (Proc. of Computational Visual Media 2017)


Line drawing is a style of image abstraction in which the perception of an image is conveyed using distinct straight or curved lines. However, extracting semantically salient lines is not trivial and is mastered only by skilled artists. While many parametric filters successfully extract accurate and coherent lines, their results are sensitive to parameter tuning and easily lead to either an excessive or an insufficient number of lines. In this work, we present an interactive system that generates concise line abstractions of arbitrary images from a few user-specified strokes. Specifically, the user simply provides a few intuitive strokes on the input image through a sketching interface, tracing roughly along edges and scribbling over regions of interest. The system then automatically extracts lines that are long, coherent, and share similar textural structure from a corresponding highly detailed line drawing image. We have tested our system on a wide variety of images. The experimental results show that our system outperforms state-of-the-art techniques in terms of quality and efficiency.


Given an input image (a), the system asks the user to provide a few simple and intuitive strokes (b) and generates a reference dataset of line segments from a detailed line drawing image (c). The system first classifies the user strokes into so-called coherence strokes (d) and structure strokes (e). A novel line matching algorithm then matches the line segments of (c) against the input coherence and structure strokes. The best-matching coherence lines (f) and structure lines (g) are combined to form the final line abstraction (h).
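The matching step described above can be illustrated with a toy sketch: each candidate line segment from the reference dataset is scored by its proximity to a user stroke, and the best-scoring candidate is kept. All names here (`stroke_match_score`, `best_matching_segment`) are hypothetical; the paper's actual matching algorithm also accounts for coherence and textural structure, which this minimal distance-only sketch omits.

```python
import numpy as np

def stroke_match_score(stroke, segment):
    """Toy proximity score between a user stroke and a candidate line
    segment, both given as (N, 2) arrays of 2D points. Lower is better.
    For each stroke point we take the distance to the closest segment
    point, then average over the stroke."""
    d = np.linalg.norm(stroke[:, None, :] - segment[None, :, :], axis=2)
    return d.min(axis=1).mean()

def best_matching_segment(stroke, segments):
    """Return the index of the candidate segment closest to the stroke."""
    scores = [stroke_match_score(stroke, s) for s in segments]
    return int(np.argmin(scores))
```

In the full system, such a score would be only one term of the matching objective; direction and structural similarity terms would be combined with it before selecting the final coherence and structure lines.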


(a) Input image. (b) Detailed line drawing produced by fDoG (flow-based difference-of-Gaussians). (c) User strokes. (d) Final line abstractions.
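fDoG guides a difference-of-Gaussians filter along the edge tangent flow of the image. As an assumption-labeled illustration of how a detailed line drawing like (b) can be obtained, the sketch below implements the simpler isotropic DoG thresholding, without the flow guidance that fDoG adds; the function names and parameter defaults are illustrative, not the paper's.

```python
import numpy as np

def gaussian_kernel(sigma, radius=None):
    """Normalized 1D Gaussian kernel."""
    if radius is None:
        radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-(x ** 2) / (2.0 * sigma ** 2))
    return k / k.sum()

def blur(img, sigma):
    """Separable Gaussian blur: convolve rows, then columns."""
    k = gaussian_kernel(sigma)
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)

def dog_lines(gray, sigma=1.0, k=1.6, tau=0.98):
    """Isotropic DoG line extraction on a grayscale image in [0, 1].
    Pixels with a negative DoG response (dark ridges) become line pixels."""
    d = blur(gray, sigma) - tau * blur(gray, k * sigma)
    return (d < 0).astype(np.uint8)
```

The resulting binary map corresponds to the dense, over-complete line drawing from which the reference line segments are extracted; fDoG additionally smooths the filter response along edge tangents to produce more coherent strokes.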


We are grateful to the anonymous reviewers for their comments and suggestions. This work was supported in part by the Ministry of Science and Technology of Taiwan (grants 103-2221-E-007-065-MY3 and 105-2221-E-007-104-MY2).