(************************************************************************)
(*  v      *   The Coq Proof Assistant  /  The Coq Development Team     *)
(* <O___,, *   INRIA - CNRS - LIX - LRI - PPS - Copyright 1999-2014     *)
(*   \VV/  **************************************************************)
(*    //   *      This file is distributed under the terms of the       *)
(*         *       GNU Lesser General Public License Version 2.1        *)
(************************************************************************)
{
  open Lexing

  type token =
    | Comment
    | Keyword
    | Declaration
    | ProofDeclaration
    | Qed
    | String
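  (* These classes are reported through the [stamp] callback of the
     [sentence] rule below; a stamped zone runs from its start offset
     up to, but not including, its end offset. *)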
  (* Without this table, the automaton would be too big and
     ocamllex would fail *)
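  (* [tag_of_ident initial id] classifies identifier [id]: it is looked up
     in the vernacular table (commands, declarations, proof starters, proof
     enders) when [initial] is true, i.e. when [id] starts a sentence, and
     in the term-level keyword table otherwise. It raises [Not_found] when
     [id] is not a keyword at all. *)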
  let tag_of_ident =
    let one_word_commands =
      [ "Add" ; "Check"; "Eval"; "Extraction" ;
        "Load" ; "Undo"; "Goal";
        "Proof" ; "Print"; "Save" ; "Restart";
        "End" ; "Section"; "Chapter"; "Transparent"; "Opaque"; "Comments" ]
    in
    let one_word_declarations =
      [ (* Definitions *)
        "Definition" ; "Let" ; "Example" ; "SubClass" ;
        "Fixpoint" ; "CoFixpoint" ; "Scheme" ; "Function" ;
        (* Assumptions *)
        "Hypothesis" ; "Variable" ; "Axiom" ; "Parameter" ; "Conjecture" ;
        "Hypotheses" ; "Variables" ; "Axioms" ; "Parameters";
        (* Inductive *)
        "Inductive" ; "CoInductive" ; "Record" ; "Structure" ;
        (* Other *)
        "Ltac" ; "Instance"; "Include"; "Context"; "Class" ;
        "Arguments" ]
    in
    let proof_declarations =
[ "Theorem" ; "Lemma" ; " Fact" ; "Remark" ; "Corollary" ;
"Proposition" ; "Property" ]
    in
    let proof_ends =
      [ "Qed" ; "Defined" ; "Admitted"; "Abort" ]
    in
    let constr_keywords =
      [ "forall"; "fun"; "match"; "fix"; "cofix"; "with"; "for";
        "end"; "as"; "let"; "in"; "if"; "then"; "else"; "return";
        "Prop"; "Set"; "Type" ]
    in
    let h = Hashtbl.create 97 in   (* for vernac *)
    let h' = Hashtbl.create 97 in  (* for constr *)
    List.iter (fun s -> Hashtbl.add h s Keyword) one_word_commands;
    List.iter (fun s -> Hashtbl.add h s Declaration) one_word_declarations;
    List.iter (fun s -> Hashtbl.add h s ProofDeclaration) proof_declarations;
    List.iter (fun s -> Hashtbl.add h s Qed) proof_ends;
    List.iter (fun s -> Hashtbl.add h' s Keyword) constr_keywords;
    (fun initial id -> Hashtbl.find (if initial then h else h') id)
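  (* Raised when the end of the input is reached before a sentence
     terminator has been found. *)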
  exception Unterminated
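  (* [here f lexbuf] applies [f] to the start and end offsets of the current
     lexeme; it is used below as [here stamp lexbuf tok] to stamp the whole
     lexeme with class [tok]. *)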
  let here f lexbuf = f (Lexing.lexeme_start lexbuf) (Lexing.lexeme_end lexbuf)
}
let space =
  [' ' '\n' '\r' '\t' '\012'] (* '\012' is form-feed *)
let firstchar =
  ['$' 'A'-'Z' 'a'-'z' '_' '\192'-'\214' '\216'-'\246' '\248'-'\255']
let identchar =
  ['$' 'A'-'Z' 'a'-'z' '_' '\192'-'\214' '\216'-'\246' '\248'-'\255' '\'' '0'-'9']
let ident = firstchar identchar*
let undotted_sep = [ '{' '}' '-' '+' '*' ]
let dot_sep = '.' (space | eof)
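(* Note that [dot_sep] only recognizes a '.' followed by a blank or by the
   end of input, so a '.' glued to the next token does not end a sentence. *)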
let multiword_declaration =
"Module" (space+ "Type")?
| "Program" space+ ident
| "Existing" space+ "Instance" "s"?
| "Canonical" space+ "Structure"
let locality = (space+ "Local")?
let multiword_command =
  ("Uns" | "S") "et" (space+ ident)*
| (("Open" | "Close") locality | "Bind" | "Delimit" )
    space+ "Scope"
| (("Reserved" space+)? "Notation" | "Infix") locality space+
| "Next" space+ "Obligation"
| "Solve" space+ "Obligations"
| "Require" space+ ("Import"|"Export")?
| "Hint" locality space+ ident
| "Reset" (space+ "Initial")?
| "Tactic" space+ "Notation"
| "Implicit" space+ "Type" "s"?
| "Combined" space+ "Scheme"
| "Extraction" space+ (("Language" space+ ("Ocaml"|"Haskell"|"Scheme"|"Toplevel"))|
("Library"|"Inline"|"NoInline"|"Blacklist"))
| "Recursive" space+ "Extraction" (space+ "Library")?
| ("Print"|"Reset") space+ "Extraction" space+ ("Inline"|"Blacklist")
| "Extract" space+ (("Inlined" space+) "Constant"| "Inductive")
| "Typeclasses" space+ ("eauto" | "Transparent" | "Opaque")
| ("Generalizable" space+) ("All" | "No")? "Variable" "s"?
(* At least still missing: "Inline" + decl, variants of "Identity
Coercion", variants of Print, Add, ... *)
rule coq_string = parse
  | "\"\"" { coq_string lexbuf }
  | "\"" { Lexing.lexeme_end lexbuf }
  | eof { Lexing.lexeme_end lexbuf }
  | _ { coq_string lexbuf }
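(* [comment] scans the remainder of a "(*"-comment, handling nested comments
   and embedded strings; it returns a pair of a boolean telling whether the
   comment was properly closed and the offset just after it. *)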
and comment = parse
  | "(*" { ignore (comment lexbuf); comment lexbuf }
  | "\"" { ignore (coq_string lexbuf); comment lexbuf }
  | "*)" { (true, Lexing.lexeme_start lexbuf + 2) }
  | eof { (false, Lexing.lexeme_end lexbuf) }
  | _ { comment lexbuf }
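(* [sentence initial stamp] scans a single sentence, stamping comments,
   strings, keywords and declarations via [stamp], and returns the offset of
   the terminator that ends the sentence; [initial] is true as long as only
   blanks have been read, and governs whether declarations, commands and
   undotted separators are recognized. *)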
and sentence initial stamp = parse
  | "(*" {
      let comm_start = Lexing.lexeme_start lexbuf in
      let truly_terminated, comm_end = comment lexbuf in
      stamp comm_start comm_end Comment;
      if not truly_terminated then raise Unterminated;
      (* A comment alone is a sentence.
         A comment in a sentence doesn't terminate the sentence.
         Note: comm_end is the first position _after_ the comment,
         as required when tagging a zone, hence the -1 to locate the
         ")" terminating the comment. *)
      if initial then comm_end - 1 else sentence false stamp lexbuf
    }
| "\"" {
let str_start = Lexing.lexeme_start lexbuf in
let str_end = coq_string lexbuf in
stamp str_start str_end String;
sentence false stamp lexbuf
}
| multiword_declaration {
if initial then here stamp lexbuf Declaration;
sentence false stamp lexbuf
}
| multiword_command {
if initial then here stamp lexbuf Keyword;
sentence false stamp lexbuf
}
| ident as id {
(try here stamp lexbuf (tag_of_ident initial id) with Not_found -> ());
sentence false stamp lexbuf }
| ".." {
(* We must have a particular rule for parsing "..", where no dot
is a terminator, even if we have a blank afterwards
(cf. for instance the syntax for recursive notation).
This rule and the following one also allow to treat the "..."
special case, where the third dot is a terminator. *)
sentence false stamp lexbuf
}
| dot_sep { Lexing.lexeme_start lexbuf } (* The usual "." terminator *)
| undotted_sep {
(* Separators like { or } and bullets * - + are only active
at the start of a sentence *)
if initial then Lexing.lexeme_start lexbuf
else sentence false stamp lexbuf
}
| space+ {
(* Parsing spaces is the only situation preserving initiality *)
sentence initial stamp lexbuf
}
| _ {
(* Any other characters *)
sentence false stamp lexbuf
}
| eof { raise Unterminated }
{
  (** Parse a sentence in string [slice], tagging relevant parts with
      function [stamp], and returning the position of the first
      sentence delimiter (either "." or "{" or "}" or the end of a comment).
      It will raise [Unterminated] when no end of sentence is found. *)
  let delimit_sentence stamp slice =
    sentence true stamp (Lexing.from_string slice)
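  (* A minimal usage sketch (the [stamp] callback and the input string are
     hypothetical, for illustration only):

       let stamp start stop _token =
         Printf.printf "tagged zone %d-%d\n" start stop
       in
       let stop = delimit_sentence stamp "Lemma foo : True. trivial. Qed." in
       (* [stop] is the offset of the "." that ends "Lemma foo : True.";
          "Lemma" itself was reported through [stamp] as [ProofDeclaration]. *)
       ignore stop *)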
}