6 Commits

Author SHA1 Message Date
greg  7c27cace9f  Add note  2018-09-17 00:03:29 -07:00
greg  23c54ae186  proc macro feature no longer needed  2018-09-10 21:28:34 -07:00
greg  015840ac38  Get program compileable  2018-09-10 21:28:14 -07:00
greg  b8487aa0d4  Add note from Sergei W. on subtyping  2018-09-10 21:01:56 -07:00
greg  bac5761534  Add a note  2018-09-06 02:12:37 -07:00
greg  926631ba8f  Pattern matching experimental code (WIP)  2018-08-29 03:00:54 -07:00
53 changed files with 5113 additions and 6361 deletions


@@ -6,10 +6,11 @@ authors = ["greg <greg.shuflin@protonmail.com>"]
[dependencies]
schala-repl = { path = "schala-repl" }
schala-lang = { path = "schala-lang/language" }
# maaru-lang = { path = "maaru" }
# rukka-lang = { path = "rukka" }
# robo-lang = { path = "robo" }
schala-codegen = { path = "schala-codegen" }
maaru-lang = { path = "maaru" }
rukka-lang = { path = "rukka" }
robo-lang = { path = "robo" }
schala-lang = { path = "schala-lang" }
[build-dependencies]
includedir_codegen = "0.2.0"

31
Grammar Normal file

@@ -0,0 +1,31 @@
<program> := <statements> EOF
<statements> := <statement>
| <statement> SEP <statements>
<statement> := let <id> = <expr>
| <expr>
| <fn_block>
<fn_block> := fn <id> ( <arg_list> ) <statements> end
<arg_list> := e
| <id>
| <id> , <arg_list>
<expr> := if <expr> then <statements> end
| if <expr> then <statements> else <statements> end
| while <expr> SEP <statements> end
| ( <expr> )
| <binop>
<binop> := <simple_expr>
| <simple_expr> <id> <binop>
<simple_expr> := <id>
| <number>
| <string>


@@ -1,24 +1,21 @@
# Schala - a programming language meta-interpreter
Schala is a Rust framework written to make it easy to create and experiment
with multiple toy programming languages. It provides a cross-language REPL and
provisions for tokenizing text, parsing tokens, evaluating an abstract syntax
tree, and other tasks that are common to all programming languages, as well as sharing state
between multiple programming languages.
Schala is a Rust framework written to make it easy to
create and experiment with toy programming languages. It provides
a common REPL, and a trait `ProgrammingLanguage` with provisions
for tokenizing text, parsing tokens, evaluating an abstract syntax tree,
and other tasks that are common to all programming languages.
Schala is implemented as a Rust library `schala-repl`, which provides a
function `start_repl`, meant to be used as entry point into a common REPL or
non-interactive environment. Clients are expected to invoke `start_repl` with a
vector of programming languages. Individual programming language
implementations are Rust types that implement the
`ProgrammingLanguageInterface` trait and store whatever persistent state is
relevant to that language.
Schala is implemented as a Rust library `schala_lib`, which provides a
`schala_main` function. This function serves as the main loop of the REPL, if run
interactively, or otherwise reads and interprets programming language source
files. It expects as input a vector of `PLIGenerator`, which is a type representing
a closure that returns a boxed trait object that implements the `ProgrammingLanguage` trait,
and stores any persistent state relevant to that programming language. The ability
to share state between different programming languages is in the works.
Run schala with: `cargo run`. This will drop you into a REPL environment. Type
`:help` for more information, or type in text in any supported programming
language (currently only schala-lang) to evaluate it in the REPL.
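To make the interface concrete, here is a minimal, hypothetical sketch of a client language. The trait methods mirror the implementations that appear elsewhere in this changeset; the `Echo` struct and the exact signature of `start_repl` are assumptions, not code from this repository.
```
extern crate schala_repl;

use schala_repl::{ProgrammingLanguageInterface, EvalOptions,
                  UnfinishedComputation, FinishedComputation};

// A toy language that simply echoes its input back.
struct Echo;

impl ProgrammingLanguageInterface for Echo {
    fn get_language_name(&self) -> String {
        "Echo".to_string()
    }
    fn get_source_file_suffix(&self) -> String {
        "echo".to_string()
    }
    fn execute_pipeline(&mut self, input: &str, _options: &EvalOptions) -> FinishedComputation {
        // A real language would tokenize, parse, and evaluate here, adding
        // TraceArtifacts for whichever debug passes were requested.
        UnfinishedComputation::default().finish(Ok(input.to_string()))
    }
}

fn main() {
    // The README describes start_repl as taking a vector of language
    // implementations; the exact signature is an assumption here.
    schala_repl::start_repl(vec![Box::new(Echo)]);
}
```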
## History
## About
Schala started out life as an experiment in writing a Javascript-like
programming language that would never encounter any kind of runtime value
@@ -36,18 +33,18 @@ creating a language name confusingly close to Scala. The naming scheme for
languages implemented with the Schala meta-interpreter is Chrono Trigger
characters.
Schala and languages implemented with it are incomplete alpha software and are
not ready for public release.
Schala is incomplete alpha software and is not ready for public release.
## Languages implemented using the meta-interpreter
* The eponymous *Schala* language is a work-in-progress general purpose
programming language with static typing and algebraic data types. Its design
goals include having a very straightforward implementation and being syntactically
minimal.
* The eponymous *Schala* language is an interpreted/compiled scripting language,
designed to be relatively simple, but with a reasonably sophisticated type
system.
* *Maaru* is a very simple dynamically-typed scripting language, with the semantics
that all runtime errors return a `null` value rather than failing.
* *Maaru* was the original Schala (since renamed to free up the name *Schala*
for the above language), a very simple dynamically-typed scripting language
such that all possible runtime errors result in null rather than program
failure.
* *Robo* is an experiment in creating a lazy, functional, strongly-typed language
much like Haskell
@@ -59,21 +56,10 @@ much like Haskell
Here's a partial list of resources I've made use of in the process
of learning how to write a programming language.
### General
http://thume.ca/2019/04/18/writing-a-compiler-in-rust/
### Type-checking
https://skillsmatter.com/skillscasts/10868-inside-the-rust-compiler
https://www.youtube.com/watch?v=il3gD7XMdmA
http://dev.stephendiehl.com/fun/006_hindley_milner.html
https://rust-lang-nursery.github.io/rustc-guide/type-inference.html
https://eli.thegreenplace.net/2018/unification/
https://eli.thegreenplace.net/2018/type-inference/
http://smallcultfollowing.com/babysteps/blog/2017/03/25/unification-in-chalk-part-1/
http://reasonableapproximation.net/2019/05/05/hindley-milner.html
https://rickyhan.com/jekyll/update/2018/05/26/hindley-milner-tutorial-rust.html
### Evaluation
*Understanding Computation*, Tom Stuart, O'Reilly 2013
@@ -91,5 +77,4 @@ http://blog.ulysse.io/2016/07/03/llvm-getting-started.html
### Rust resources
https://thefullsnack.com/en/rust-for-the-web.html
https://rocket.rs/guide/getting-started/

147
TODO.md

@@ -1,84 +1,24 @@
# TODO items
## General code cleanup
-experiment with storing metadata via ItemIds on AST nodes (cf. https://rust-lang.github.io/rustc-guide/hir.html, https://github.com/rust-lang/rust/blob/master/src/librustc/hir/mod.rs )
-implement and test open/use statements
-implement field access
- standardize on an error type that isn't String
-implement a visitor pattern for the use of scope_resolver
# TODO Items
## Reduction
- make a good type for actual language builtins to avoid string comparisons
-make sure to include a :doc command at the REPL that can interface with a lang in a generic way
## Typechecking
- a subtype is a situation where the compiler is entitled to add a type conversion in the type-checking process
b/c that type conversion doesn't correspond to a computation
-Sergei W.
- make a type to represent types rather than relying on string comparisons
- look at https://rickyhan.com/jekyll/update/2018/05/26/hindley-milner-tutorial-rust.html
- cf. the notation mentioned in the Cardelli paper, the debug information for the `typechecking` pass should
print the generated type variable for every subexpression in an expression
- think about idris-related ideas of multiple implementations of a type for an interface (+ vs * impl for monoids, for preorder/inorder/postorder for Foldable)
-should have an Idris-like `cast To From` function
## Schala-lang syntax
-idea: the `type` declaration should have some kind of GADT-like syntax
- Idea: if you have a pattern-match where one variant has a variable and the other lacks it
instead of treating this as a type error, promote the bound variable to an option type
- Include extensible scala-style html"string ${var}" string interpolations
- A neat idea for pattern matching optimization would be if you could match on one of several things in a list
*A neat idea for pattern matching optimization would be if you could match on one of several things in a list
ex:
```if x {
if x {
is (comp, LHSPat, RHSPat) if comp in ["==", "<"] -> ...
}```
}
- Schala should have both currying *and* default arguments!
```fn a(b: Int, c:Int, d:Int = 1) -> Int
a(1,2) : Int
a(1,2,d=2): Int
a(_,1,3) : Int -> Int
a(1,2, c=_): Int -> Int
a(_,_,_) : Int -> Int -> Int -> Int
```
- scoped types - be able to define a quick enum type scoped to a function or other type for
something, that is only meant to be used as a quick bespoke interface between
two other things
ex.
```type enum {
type enum MySubVariant {
SubVariant1, SubVariant2, etc.
}
Variant1(MySubVariant),
Variant2(...),
}```
- inclusive/exclusive range syntax like .. vs ..=
## Compilation
-look into Inkwell for rust LLVM bindings
-https://cranelift.readthedocs.io/en/latest/?badge=latest
## Other links of note
- https://nshipster.com/never/
-https://cranelift.readthedocs.io/en/latest/?badge=latest
-consult http://gluon-lang.org/book/embedding-api.html
## Playing around with conditional syntax ideas
-
- if/match playground
simple if
@@ -117,3 +57,70 @@ if the only two guard patterns are true and false, then the abbreviated syntax:
`'if' discriminator 'then' block_or_expr 'else' block_or_expr`
can replace `'if' discriminator '{' 'true' 'then' block_or_expr; 'false' 'then' block_or_expr '}'`
- Next priorities: get ADTs working, get matches working
- inclusive/exclusive range syntax like .. vs ..=
- sketch of an idea for the REPL:
-each compiler pass should be a (procedural?) macro like
compiler_pass!("parse", dataproducts: ["ast", "parse_tree"], {
match parsing::parse(INPUT) {
Ok(
PASS.add_artifact(
}
-should have an Idris-like `cast To From` function
- REPL:
- want to be able to do things like `:doc Identifier`, and have the language load up these definitions to the REPL
* change 'trait' to 'interface'
-think about idris-related ideas of multiple implementations of a type for an interface (+ vs * impl for monoids, for preorder/inorder/postorder for Foldable)
* Share state between programming languages
* idea for Schala - scoped types - be able to define a quick enum type scoped to a function or something, that is only meant to be used as a quick bespoke interface between two other things
* another idea, allow:
type enum {
type enum MySubVariant {
SubVariant1, SubVariant2, etc.
}
Variant1(MySubVariant),
Variant2(...),
}
* idea for Schala: both currying *and* default arguments!
ex. fn a(b: Int, c:Int, d:Int = 1) -> Int
a(1,2) : Int
a(1,2,d=2): Int
a(_,1,3) : Int -> Int
a(1,2, c=_): Int -> Int
a(_,_,_) : Int -> Int -> Int -> Int
- AST: maybe replace the Expression type with "Ascription(TypeName, Box<Expression>)" nodes??
- parser: add a "debug" field to the Parser struct for all debug-related things
-scala-style html"dfasfsadf${}" string interpolations!
*Compiler passes architecture
-ProgrammingLanguageInterface defines evaluate_in_repl() and evaluate_no_repl() functions
-these take in a vec of CompilerPasses
struct CompilerPass {
name: String,
run: fn(PrevPass) -> NextPass
}
-change "Type...." names in parser.rs to "Anno..." for non-collision with names in typechecking.rs
-get rid of code pertaining to compilation specifically, have a more general notion of "execution type"

279
maaru/src/compilation.rs Normal file

@@ -0,0 +1,279 @@
extern crate llvm_sys;
use std::collections::HashMap;
use self::llvm_sys::prelude::*;
use self::llvm_sys::{LLVMIntPredicate};
use parser::{AST, Statement, Function, Prototype, Expression, BinOp};
use schala_repl::LLVMCodeString;
use schala_repl::llvm_wrap as LLVMWrap;
type VariableMap = HashMap<String, LLVMValueRef>;
struct CompilationData {
context: LLVMContextRef,
module: LLVMModuleRef,
builder: LLVMBuilderRef,
variables: VariableMap,
main_function: LLVMValueRef,
current_function: Option<LLVMValueRef>,
}
pub fn compile_ast(ast: AST) -> LLVMCodeString {
println!("Compiling!");
let names: VariableMap = HashMap::new();
let context = LLVMWrap::create_context();
let module = LLVMWrap::module_create_with_name("example module");
let builder = LLVMWrap::CreateBuilderInContext(context);
let program_return_type = LLVMWrap::Int64TypeInContext(context);
let main_function_type = LLVMWrap::FunctionType(program_return_type, Vec::new(), false);
let main_function: LLVMValueRef = LLVMWrap::AddFunction(module, "main", main_function_type);
let mut data = CompilationData {
context: context,
builder: builder,
module: module,
variables: names,
main_function: main_function,
current_function: None,
};
let bb = LLVMWrap::AppendBasicBlockInContext(data.context, data.main_function, "entry");
LLVMWrap::PositionBuilderAtEnd(builder, bb);
let value = ast.codegen(&mut data);
LLVMWrap::BuildRet(builder, value);
let ret = LLVMWrap::PrintModuleToString(module);
// Clean up. Values created in the context mostly get cleaned up there.
LLVMWrap::DisposeBuilder(builder);
LLVMWrap::DisposeModule(module);
LLVMWrap::ContextDispose(context);
LLVMCodeString(ret)
}
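// Lowering trait: each AST node type turns itself into an LLVM value, threading the
// shared CompilationData (context, module, builder, variable map) through the walk.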
trait CodeGen {
fn codegen(&self, &mut CompilationData) -> LLVMValueRef;
}
impl CodeGen for AST {
fn codegen(&self, data: &mut CompilationData) -> LLVMValueRef {
let int_type = LLVMWrap::Int64TypeInContext(data.context);
let mut ret = LLVMWrap::ConstInt(int_type, 0, false);
for statement in self {
ret = statement.codegen(data);
}
ret
}
}
impl CodeGen for Statement {
fn codegen(&self, data: &mut CompilationData) -> LLVMValueRef {
use self::Statement::*;
match self {
&ExprNode(ref expr) => expr.codegen(data),
&FuncDefNode(ref func) => func.codegen(data),
}
}
}
impl CodeGen for Function {
fn codegen(&self, data: &mut CompilationData) -> LLVMValueRef {
/* should have a check here for function already being defined */
let function = self.prototype.codegen(data);
let ref body = self.body;
data.current_function = Some(function);
let return_type = LLVMWrap::Int64TypeInContext(data.context);
let mut ret = LLVMWrap::ConstInt(return_type, 0, false);
let block = LLVMWrap::AppendBasicBlockInContext(data.context, function, "entry");
LLVMWrap::PositionBuilderAtEnd(data.builder, block);
//insert function params into variables
for value in LLVMWrap::GetParams(function) {
let name = LLVMWrap::GetValueName(value);
data.variables.insert(name, value);
}
for expr in body {
ret = expr.codegen(data);
}
LLVMWrap::BuildRet(data.builder, ret);
// get basic block of main
let main_bb = LLVMWrap::GetBasicBlocks(data.main_function).get(0).expect("Couldn't get first block of main").clone();
LLVMWrap::PositionBuilderAtEnd(data.builder, main_bb);
data.current_function = None;
ret
}
}
impl CodeGen for Prototype {
fn codegen(&self, data: &mut CompilationData) -> LLVMValueRef {
let num_args = self.parameters.len();
let return_type = LLVMWrap::Int64TypeInContext(data.context);
let mut arguments: Vec<LLVMTypeRef> = vec![];
for _ in 0..num_args {
arguments.push(LLVMWrap::Int64TypeInContext(data.context));
}
let function_type =
LLVMWrap::FunctionType(return_type,
arguments,
false);
let function = LLVMWrap::AddFunction(data.module,
&*self.name,
function_type);
let function_params = LLVMWrap::GetParams(function);
for (index, param) in function_params.iter().enumerate() {
let name = self.parameters.get(index).expect(&format!("Failed this check at index {}", index));
let new = *param;
LLVMWrap::SetValueName(new, name);
}
function
}
}
impl CodeGen for Expression {
fn codegen(&self, data: &mut CompilationData) -> LLVMValueRef {
use self::BinOp::*;
use self::Expression::*;
let int_type = LLVMWrap::Int64TypeInContext(data.context);
let zero = LLVMWrap::ConstInt(int_type, 0, false);
match *self {
Variable(ref name) => *data.variables.get(&**name).expect(&format!("Can't find variable {}", name)),
BinExp(Assign, ref left, ref right) => {
if let Variable(ref name) = **left {
let new_value = right.codegen(data);
data.variables.insert((**name).clone(), new_value);
new_value
} else {
panic!("Bad variable assignment")
}
}
BinExp(ref op, ref left, ref right) => {
let lhs = left.codegen(data);
let rhs = right.codegen(data);
op.codegen_with_ops(data, lhs, rhs)
}
Number(ref n) => {
let native_val = *n as u64;
let int_value: LLVMValueRef = LLVMWrap::ConstInt(int_type, native_val, false);
int_value
}
Conditional(ref test, ref then_expr, ref else_expr) => {
let condition_value = test.codegen(data);
let is_nonzero =
LLVMWrap::BuildICmp(data.builder,
LLVMIntPredicate::LLVMIntNE,
condition_value,
zero,
"ifcond");
let func = LLVMWrap::GetBasicBlockParent(LLVMWrap::GetInsertBlock(data.builder));
let mut then_block =
LLVMWrap::AppendBasicBlockInContext(data.context, func, "then_block");
let mut else_block =
LLVMWrap::AppendBasicBlockInContext(data.context, func, "else_block");
let merge_block =
LLVMWrap::AppendBasicBlockInContext(data.context, func, "ifcont");
// add conditional branch to ifcond block
LLVMWrap::BuildCondBr(data.builder, is_nonzero, then_block, else_block);
// start inserting into then block
LLVMWrap::PositionBuilderAtEnd(data.builder, then_block);
// then-block codegen
let then_return = then_expr.codegen(data);
LLVMWrap::BuildBr(data.builder, merge_block);
// update then block b/c recursive codegen() call may have changed the notion of
// the current block
then_block = LLVMWrap::GetInsertBlock(data.builder);
// then do the same stuff again for the else branch
//
LLVMWrap::PositionBuilderAtEnd(data.builder, else_block);
let else_return = match *else_expr {
Some(ref e) => e.codegen(data),
None => zero,
};
LLVMWrap::BuildBr(data.builder, merge_block);
else_block = LLVMWrap::GetInsertBlock(data.builder);
LLVMWrap::PositionBuilderAtEnd(data.builder, merge_block);
let phi = LLVMWrap::BuildPhi(data.builder, int_type, "phinode");
let values = vec![then_return, else_return];
let blocks = vec![then_block, else_block];
LLVMWrap::AddIncoming(phi, values, blocks);
phi
}
Block(ref exprs) => {
let mut ret = zero;
for e in exprs.iter() {
ret = e.codegen(data);
}
ret
}
ref e => {
println!("Unimplemented {:?}", e);
unimplemented!()
}
}
}
}
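// Binary operators lower to single LLVM instructions; comparisons produce an i1,
// which is zero-extended back to the i64 type that every value is modeled as here.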
impl BinOp {
fn codegen_with_ops(&self, data: &CompilationData, lhs: LLVMValueRef, rhs: LLVMValueRef) -> LLVMValueRef {
use self::BinOp::*;
macro_rules! simple_binop {
($fnname: expr, $name: expr) => {
$fnname(data.builder, lhs, rhs, $name)
}
}
let int_type = LLVMWrap::Int64TypeInContext(data.context);
match *self {
Add => simple_binop!(LLVMWrap::BuildAdd, "addtemp"),
Sub => simple_binop!(LLVMWrap::BuildSub, "subtemp"),
Mul => simple_binop!(LLVMWrap::BuildMul, "multemp"),
Div => simple_binop!(LLVMWrap::BuildUDiv, "divtemp"),
Mod => simple_binop!(LLVMWrap::BuildSRem, "remtemp"),
Less => {
let pred: LLVMValueRef =
LLVMWrap::BuildICmp(data.builder, LLVMIntPredicate::LLVMIntULT, lhs, rhs, "tmp");
LLVMWrap::BuildZExt(data.builder, pred, int_type, "temp")
}
Greater => {
let pred: LLVMValueRef =
LLVMWrap::BuildICmp(data.builder, LLVMIntPredicate::LLVMIntUGT, lhs, rhs, "tmp");
LLVMWrap::BuildZExt(data.builder, pred, int_type, "temp")
}
ref unknown => panic!("Bad operator {:?}", unknown),
}
}
}


@@ -5,6 +5,9 @@ extern crate schala_repl;
mod tokenizer;
mod parser;
mod eval;
mod compilation;
use schala_repl::{ProgrammingLanguageInterface, EvalOptions, UnfinishedComputation, FinishedComputation, TraceArtifact};
#[derive(Debug)]
pub struct TokenError {
@@ -31,42 +34,6 @@ impl<'a> Maaru<'a> {
}
}
/*
fn execute_pipeline(&mut self, input: &str, options: &EvalOptions) -> Result<String, String> {
let mut output = UnfinishedComputation::default();
let tokens = match tokenizer::tokenize(input) {
Ok(tokens) => {
if let Some(_) = options.debug_passes.get("tokens") {
output.add_artifact(TraceArtifact::new("tokens", format!("{:?}", tokens)));
}
tokens
},
Err(err) => {
return output.finish(Err(format!("Tokenization error: {:?}\n", err.msg)))
}
};
let ast = match parser::parse(&tokens, &[]) {
Ok(ast) => {
if let Some(_) = options.debug_passes.get("ast") {
output.add_artifact(TraceArtifact::new("ast", format!("{:?}", ast)));
}
ast
},
Err(err) => {
return output.finish(Err(format!("Parse error: {:?}\n", err.msg)))
}
};
let mut evaluation_output = String::new();
for s in self.evaluator.run(ast).iter() {
evaluation_output.push_str(s);
}
Ok(evaluation_output)
}
*/
/*
impl<'a> ProgrammingLanguageInterface for Maaru<'a> {
fn get_language_name(&self) -> String {
"Maaru".to_string()
@@ -74,5 +41,63 @@ impl<'a> ProgrammingLanguageInterface for Maaru<'a> {
fn get_source_file_suffix(&self) -> String {
format!("maaru")
}
}
fn execute_pipeline(&mut self, input: &str, options: &EvalOptions) -> FinishedComputation {
let mut output = UnfinishedComputation::default();
let tokens = match tokenizer::tokenize(input) {
Ok(tokens) => {
if let Some(_) = options.debug_passes.get("tokens") {
output.add_artifact(TraceArtifact::new("tokens", format!("{:?}", tokens)));
}
tokens
},
Err(err) => {
return output.finish(Err(format!("Tokenization error: {:?}\n", err.msg)))
}
};
let ast = match parser::parse(&tokens, &[]) {
Ok(ast) => {
if let Some(_) = options.debug_passes.get("ast") {
output.add_artifact(TraceArtifact::new("ast", format!("{:?}", ast)));
}
ast
},
Err(err) => {
return output.finish(Err(format!("Parse error: {:?}\n", err.msg)))
}
};
let mut evaluation_output = String::new();
for s in self.evaluator.run(ast).iter() {
evaluation_output.push_str(s);
}
output.finish(Ok(evaluation_output))
}
/* TODO make this work with new framework */
/*
fn can_compile(&self) -> bool {
true
}
fn compile(&mut self, input: &str) -> LLVMCodeString {
let tokens = match tokenizer::tokenize(input) {
Ok(tokens) => tokens,
Err(err) => {
let msg = format!("Tokenization error: {:?}\n", err.msg);
panic!("{}", msg);
}
};
let ast = match parser::parse(&tokens, &[]) {
Ok(ast) => ast,
Err(err) => {
let msg = format!("Parse error: {:?}\n", err.msg);
panic!("{}", msg);
}
};
compilation::compile_ast(ast)
}
*/
}


@@ -4,7 +4,7 @@ extern crate itertools;
extern crate schala_repl;
use itertools::Itertools;
use schala_repl::{ProgrammingLanguageInterface, EvalOptions};
use schala_repl::{ProgrammingLanguageInterface, EvalOptions, FinishedComputation, UnfinishedComputation};
pub struct Robo {
}
@@ -154,5 +154,17 @@ impl ProgrammingLanguageInterface for Robo {
fn get_source_file_suffix(&self) -> String {
format!("robo")
}
fn execute_pipeline(&mut self, input: &str, _eval_options: &EvalOptions) -> FinishedComputation {
let output = UnfinishedComputation::default();
let tokens = match tokenize(input) {
Ok(tokens) => tokens,
Err(e) => {
return output.finish(Err(format!("Tokenize error: {:?}", e)));
}
};
output.finish(Ok(format!("{:?}", tokens)))
}
}


@@ -4,7 +4,7 @@ extern crate itertools;
extern crate schala_repl;
use itertools::Itertools;
use schala_repl::{ProgrammingLanguageInterface, EvalOptions};
use schala_repl::{ProgrammingLanguageInterface, EvalOptions, UnfinishedComputation, FinishedComputation};
use std::iter::Peekable;
use std::vec::IntoIter;
use std::str::Chars;
@@ -72,6 +72,24 @@ impl ProgrammingLanguageInterface for Rukka {
fn get_source_file_suffix(&self) -> String {
format!("rukka")
}
fn execute_pipeline(&mut self, input: &str, _eval_options: &EvalOptions) -> FinishedComputation {
let output = UnfinishedComputation::default();
let sexps = match read(input) {
Err(err) => {
return output.finish(Err(format!("Error: {}", err)));
},
Ok(sexps) => sexps
};
let output_str: String = sexps.into_iter().enumerate().map(|(i, sexp)| {
match self.state.eval(sexp) {
Ok(result) => format!("{}: {}", i, result.print()),
Err(err) => format!("{} Error: {}", i, err),
}
}).intersperse(format!("\n")).collect();
output.finish(Ok(output_str))
}
}
impl EvaluatorState {

12
schala-codegen/Cargo.toml Normal file

@@ -0,0 +1,12 @@
[package]
name = "schala-codegen"
version = "0.1.0"
authors = ["greg <greg.shuflin@protonmail.com>"]
[dependencies]
syn = { version = "0.13.1", features = ["full", "extra-traits"] }
quote = "0.5"
schala-repl = { path = "../schala-repl" }
[lib]
proc-macro = true

102
schala-codegen/src/lib.rs Normal file

@@ -0,0 +1,102 @@
#![feature(trace_macros)]
#![feature(proc_macro)]
extern crate proc_macro;
#[macro_use]
extern crate quote;
extern crate syn;
extern crate schala_repl;
use proc_macro::TokenStream;
use syn::{Ident, Attribute, DeriveInput};
fn extract_attribute_arg_by_name(name: &str, attrs: &Vec<Attribute>) -> Option<String> {
use syn::{Meta, Lit, MetaNameValue};
attrs.iter().map(|attr| attr.interpret_meta()).find(|meta| {
match meta {
&Some(Meta::NameValue(MetaNameValue { ident, .. })) if ident.as_ref() == name => true,
_ => false
}
}).and_then(|meta| {
match meta {
Some(Meta::NameValue(MetaNameValue { lit: Lit::Str(litstr), .. })) => Some(litstr.value()),
_ => None,
}
})
}
fn extract_attribute_list(name: &str, attrs: &Vec<Attribute>) -> Option<Vec<(Ident, Option<Vec<Ident>>)>> {
use syn::{Meta, MetaList, NestedMeta};
attrs.iter().find(|attr| {
match attr.path.segments.iter().nth(0) {
Some(segment) if segment.ident.as_ref() == name => true,
_ => false
}
}).and_then(|attr| {
match attr.interpret_meta() {
Some(Meta::List(MetaList { nested, .. })) => {
Some(nested.iter().map(|nested_meta| match nested_meta {
&NestedMeta::Meta(Meta::Word(ident)) => (ident, None),
&NestedMeta::Meta(Meta::List(MetaList { ident, nested: ref nested2, .. })) => {
let own_args = nested2.iter().map(|nested_meta2| match nested_meta2 {
&NestedMeta::Meta(Meta::Word(ident)) => ident,
_ => panic!("Bad format for doubly-nested attribute list")
}).collect();
(ident, Some(own_args))
},
_ => panic!("Bad format for nested list")
}).collect())
},
_ => panic!("{} must be a comma-delimited list surrounded by parens", name)
}
})
}
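// Generates a ProgrammingLanguageInterface impl: the language name and source file
// extension come from the #[LanguageName]/#[SourceFileExtension] attributes, and the
// generated execute_pipeline chains the passes named in #[PipelineSteps(...)] via pass_chain!.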
#[proc_macro_derive(ProgrammingLanguageInterface, attributes(LanguageName, SourceFileExtension, PipelineSteps))]
pub fn derive_programming_language_interface(input: TokenStream) -> TokenStream {
let ast: DeriveInput = syn::parse(input).unwrap();
let name = &ast.ident;
let attrs = &ast.attrs;
let language_name: String = extract_attribute_arg_by_name("LanguageName", attrs).expect("LanguageName is required");
let file_ext = extract_attribute_arg_by_name("SourceFileExtension", attrs).expect("SourceFileExtension is required");
let passes = extract_attribute_list("PipelineSteps", attrs).expect("PipelineSteps are required");
let pass_idents = passes.iter().map(|x| x.0);
//let pass_names: Vec<String> = passes.iter().map(|pass| pass.0.to_string()).collect();
let pass_descriptors = passes.iter().map(|pass| {
let name = pass.0.to_string();
let opts: Vec<String> = match &pass.1 {
None => vec![],
Some(opts) => opts.iter().map(|o| o.to_string()).collect(),
};
quote! {
PassDescriptor {
name: #name.to_string(),
debug_options: vec![#(format!(#opts)),*]
}
}
});
let tokens = quote! {
use schala_repl::PassDescriptor;
impl ProgrammingLanguageInterface for #name {
fn get_language_name(&self) -> String {
#language_name.to_string()
}
fn get_source_file_suffix(&self) -> String {
#file_ext.to_string()
}
fn execute_pipeline(&mut self, input: &str, options: &EvalOptions) -> FinishedComputation {
let mut chain = pass_chain![self, options; #(#pass_idents),* ];
chain(input)
}
fn get_passes(&self) -> Vec<PassDescriptor> {
vec![ #(#pass_descriptors),* ]
//vec![ #(PassDescriptor { name: #pass_names.to_string(), debug_options: vec![] }),* ]
}
}
};
tokens.into()
}
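A hedged sketch of how a language crate might consume this derive, based only on the attributes the macro reads above. The struct name, the pass identifiers, and their debug options are illustrative assumptions, and `pass_chain!` is assumed to be a macro exported by `schala-repl` for the generated `execute_pipeline` to use.
```
// Hypothetical client of schala-codegen; not part of this changeset.
#[macro_use] extern crate schala_repl;    // assumed to export the pass_chain! macro
#[macro_use] extern crate schala_codegen;

use schala_repl::{ProgrammingLanguageInterface, EvalOptions, FinishedComputation};

#[derive(ProgrammingLanguageInterface)]
#[LanguageName = "Example"]
#[SourceFileExtension = "ex"]
// Each entry names a pipeline pass; a parenthesized list becomes that pass's debug_options.
#[PipelineSteps(tokenizing(tokens), parsing(ast), eval)]
struct Example {
    // Persistent interpreter state for the language would live here.
}
```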

13
schala-lang/Cargo.toml Normal file

@@ -0,0 +1,13 @@
[package]
name = "schala-lang"
version = "0.1.0"
authors = ["greg <greg.shuflin@protonmail.com>"]
[dependencies]
itertools = "0.5.8"
take_mut = "0.1.3"
maplit = "*"
lazy_static = "0.2.8"
schala-repl = { path = "../schala-repl" }
schala-codegen = { path = "../schala-codegen" }


@@ -1,12 +0,0 @@
[package]
name = "schala-lang-codegen"
version = "0.1.0"
authors = ["greg <greg.shuflin@protonmail.com>"]
edition = "2018"
[lib]
proc-macro = true
[dependencies]
syn = { version = "0.15.12", features = ["full", "extra-traits", "fold"] }
quote = "0.6.8"


@@ -1,50 +0,0 @@
#![feature(box_patterns)]
#![recursion_limit="128"]
extern crate proc_macro;
#[macro_use]
extern crate quote;
#[macro_use]
extern crate syn;
use self::proc_macro::TokenStream;
use self::syn::fold::Fold;
struct RecursiveDescentFn {
}
impl Fold for RecursiveDescentFn {
fn fold_item_fn(&mut self, mut i: syn::ItemFn) -> syn::ItemFn {
let box block = i.block;
let ref ident = i.ident;
let new_block: syn::Block = parse_quote! {
{
let next_token_before_parse = self.token_handler.peek();
let record = ParseRecord {
production_name: stringify!(#ident).to_string(),
next_token: format!("{}", next_token_before_parse.to_string_with_metadata()),
level: self.parse_level,
};
self.parse_level += 1;
self.parse_record.push(record);
let result = { #block };
if self.parse_level != 0 {
self.parse_level -= 1;
}
result
}
};
i.block = Box::new(new_block);
i
}
}
#[proc_macro_attribute]
pub fn recursive_descent_method(_attr: TokenStream, item: TokenStream) -> TokenStream {
let input: syn::ItemFn = parse_macro_input!(item as syn::ItemFn);
let mut folder = RecursiveDescentFn {};
let output = folder.fold_item_fn(input);
TokenStream::from(quote!(#output))
}


@@ -1,18 +0,0 @@
[package]
name = "schala-lang"
version = "0.1.0"
authors = ["greg <greg.shuflin@protonmail.com>"]
edition = "2018"
[dependencies]
itertools = "0.8.0"
take_mut = "0.2.2"
maplit = "1.0.1"
lazy_static = "1.3.0"
failure = "0.1.5"
ena = "0.11.0"
stopwatch = "0.0.7"
derivative = "1.0.3"
schala-lang-codegen = { path = "../codegen" }
schala-repl = { path = "../../schala-repl" }


@@ -1,310 +0,0 @@
use std::rc::Rc;
use std::convert::From;
use crate::derivative::Derivative;
use crate::symbol_table::FullyQualifiedSymbolName;
mod operators;
pub use operators::*;
/// An abstract identifier for an AST node
#[derive(Debug, PartialEq, Eq, Hash, Clone)]
pub struct ItemId {
idx: u32,
}
impl ItemId {
fn new(n: u32) -> ItemId {
ItemId { idx: n }
}
}
pub struct ItemIdStore {
last_idx: u32
}
impl ItemIdStore {
pub fn new() -> ItemIdStore {
ItemIdStore { last_idx: 0 }
}
/// Always returns an ItemId with internal value zero
#[cfg(test)]
pub fn new_id() -> ItemId {
ItemId { idx: 0 }
}
/// This limits the size of the AST to 2^32 tree elements
pub fn fresh(&mut self) -> ItemId {
let idx = self.last_idx;
self.last_idx += 1;
ItemId::new(idx)
}
}
#[derive(Clone, Debug, PartialEq)]
pub struct Meta<T> {
pub n: T,
pub fqsn: Option<FullyQualifiedSymbolName>
}
impl<T> Meta<T> {
pub fn new(n: T) -> Meta<T> {
Meta { n,
fqsn: None,
}
}
pub fn node(&self) -> &T {
&self.n
}
pub fn mut_node(&mut self) -> &mut T {
&mut self.n
}
}
//TODO this PartialEq is here to make tests work - find a way to make it not necessary
#[derive(Clone, Debug, Default, PartialEq)]
struct SourceMap {
}
impl From<Expression> for Meta<Expression> {
fn from(expr: Expression) -> Meta<Expression> {
Meta::new(expr)
}
}
#[derive(Derivative, Debug)]
#[derivative(PartialEq)]
pub struct AST {
#[derivative(PartialEq="ignore")]
pub id: ItemId,
pub statements: Vec<Statement>
}
#[derive(Derivative, Debug, Clone)]
#[derivative(PartialEq)]
pub struct Statement {
#[derivative(PartialEq="ignore")]
pub id: ItemId,
pub kind: StatementKind,
}
#[derive(Debug, PartialEq, Clone)]
pub enum StatementKind {
Expression(Expression),
Declaration(Declaration), //TODO Declaration should also be Meta-wrapped; only Expression and Declaration are Meta-wrapped maybe?
}
pub type Block = Vec<Statement>;
pub type ParamName = Rc<String>;
#[derive(Debug, Derivative, Clone)]
#[derivative(PartialEq)]
pub struct QualifiedName {
#[derivative(PartialEq="ignore")]
pub id: ItemId,
pub components: Vec<Rc<String>>,
}
#[derive(Debug, PartialEq, Clone)]
pub struct FormalParam {
pub name: ParamName,
pub default: Option<Expression>,
pub anno: Option<TypeIdentifier>
}
#[derive(Debug, PartialEq, Clone)]
pub enum Declaration {
FuncSig(Signature),
FuncDecl(Signature, Block),
TypeDecl {
name: TypeSingletonName,
body: TypeBody,
mutable: bool
},
TypeAlias(Rc<String>, Rc<String>), //should have TypeSingletonName in it, or maybe just String, not sure
Binding {
name: Rc<String>,
constant: bool,
type_anno: Option<TypeIdentifier>,
expr: Expression,
},
Impl {
type_name: TypeIdentifier,
interface_name: Option<TypeSingletonName>,
block: Vec<Declaration>,
},
Interface {
name: Rc<String>,
signatures: Vec<Signature>
}
}
#[derive(Debug, PartialEq, Clone)]
pub struct Signature {
pub name: Rc<String>,
pub operator: bool,
pub params: Vec<FormalParam>,
pub type_anno: Option<TypeIdentifier>,
}
#[derive(Debug, PartialEq, Clone)]
pub struct TypeBody(pub Vec<Variant>);
#[derive(Debug, PartialEq, Clone)]
pub enum Variant {
UnitStruct(Rc<String>),
TupleStruct(Rc<String>, Vec<TypeIdentifier>),
Record {
name: Rc<String>,
members: Vec<(Rc<String>, TypeIdentifier)>,
}
}
#[derive(Debug, Derivative, Clone)]
#[derivative(PartialEq)]
pub struct Expression {
#[derivative(PartialEq="ignore")]
pub id: ItemId,
pub kind: ExpressionKind,
pub type_anno: Option<TypeIdentifier>
}
impl Expression {
pub fn new(id: ItemId, kind: ExpressionKind) -> Expression {
Expression { id, kind, type_anno: None }
}
pub fn with_anno(id: ItemId, kind: ExpressionKind, type_anno: TypeIdentifier) -> Expression {
Expression { id, kind, type_anno: Some(type_anno) }
}
}
#[derive(Debug, PartialEq, Clone)]
pub enum TypeIdentifier {
Tuple(Vec<TypeIdentifier>),
Singleton(TypeSingletonName)
}
#[derive(Debug, PartialEq, Clone)]
pub struct TypeSingletonName {
pub name: Rc<String>,
pub params: Vec<TypeIdentifier>,
}
#[derive(Debug, PartialEq, Clone)]
pub enum ExpressionKind {
NatLiteral(u64),
FloatLiteral(f64),
StringLiteral(Rc<String>),
BoolLiteral(bool),
BinExp(BinOp, Box<Expression>, Box<Expression>),
PrefixExp(PrefixOp, Box<Expression>),
TupleLiteral(Vec<Expression>),
Value(QualifiedName),
NamedStruct {
name: QualifiedName,
fields: Vec<(Rc<String>, Expression)>,
},
Call {
f: Box<Expression>,
arguments: Vec<InvocationArgument>,
},
Index {
indexee: Box<Expression>,
indexers: Vec<Expression>,
},
IfExpression {
discriminator: Box<Discriminator>,
body: Box<IfExpressionBody>,
},
WhileExpression {
condition: Option<Box<Expression>>,
body: Block,
},
ForExpression {
enumerators: Vec<Enumerator>,
body: Box<ForBody>,
},
Lambda {
params: Vec<FormalParam>,
type_anno: Option<TypeIdentifier>,
body: Block,
},
ListLiteral(Vec<Expression>),
}
#[derive(Debug, PartialEq, Clone)]
pub enum InvocationArgument {
Positional(Expression),
Keyword {
name: Rc<String>,
expr: Expression,
},
Ignored
}
#[derive(Debug, PartialEq, Clone)]
pub enum Discriminator {
Simple(Expression),
BinOp(Expression, BinOp)
}
#[derive(Debug, PartialEq, Clone)]
pub enum IfExpressionBody {
SimpleConditional(Block, Option<Block>),
SimplePatternMatch(Pattern, Block, Option<Block>),
GuardList(Vec<GuardArm>)
}
#[derive(Debug, PartialEq, Clone)]
pub struct GuardArm {
pub guard: Guard,
pub body: Block,
}
#[derive(Debug, PartialEq, Clone)]
pub enum Guard {
Pat(Pattern),
HalfExpr(HalfExpr)
}
#[derive(Debug, PartialEq, Clone)]
pub struct HalfExpr {
pub op: Option<BinOp>,
pub expr: ExpressionKind,
}
#[derive(Debug, PartialEq, Clone)]
pub enum Pattern {
Ignored,
TuplePattern(Vec<Pattern>),
Literal(PatternLiteral),
TupleStruct(QualifiedName, Vec<Pattern>),
Record(QualifiedName, Vec<(Rc<String>, Pattern)>),
VarOrName(QualifiedName),
}
#[derive(Debug, PartialEq, Clone)]
pub enum PatternLiteral {
NumPattern {
neg: bool,
num: ExpressionKind,
},
StringPattern(Rc<String>),
BoolPattern(bool),
}
#[derive(Debug, PartialEq, Clone)]
pub struct Enumerator {
pub id: Rc<String>,
pub generator: Expression,
}
#[derive(Debug, PartialEq, Clone)]
pub enum ForBody {
MonadicReturn(Expression),
StatementBlock(Block),
}


@@ -1,112 +0,0 @@
use std::rc::Rc;
use std::str::FromStr;
use crate::tokenizing::TokenKind;
use crate::builtin::Builtin;
#[derive(Debug, PartialEq, Clone)]
pub struct PrefixOp {
sigil: Rc<String>,
pub builtin: Option<Builtin>,
}
impl PrefixOp {
#[allow(dead_code)]
pub fn sigil(&self) -> &Rc<String> {
&self.sigil
}
pub fn is_prefix(op: &str) -> bool {
match op {
"+" => true,
"-" => true,
"!" => true,
_ => false
}
}
}
impl FromStr for PrefixOp {
type Err = ();
fn from_str(s: &str) -> Result<Self, Self::Err> {
use Builtin::*;
let builtin = match s {
"+" => Ok(Increment),
"-" => Ok(Negate),
"!" => Ok(BooleanNot),
_ => Err(())
};
builtin.map(|builtin| PrefixOp { sigil: Rc::new(s.to_string()), builtin: Some(builtin) })
}
}
#[derive(Debug, PartialEq, Clone)]
pub struct BinOp {
sigil: Rc<String>,
pub builtin: Option<Builtin>,
}
impl BinOp {
pub fn from_sigil(sigil: &str) -> BinOp {
let builtin = Builtin::from_str(sigil).ok();
BinOp { sigil: Rc::new(sigil.to_string()), builtin }
}
pub fn sigil(&self) -> &Rc<String> {
&self.sigil
}
pub fn from_sigil_token(tok: &TokenKind) -> Option<BinOp> {
let s = token_kind_to_sigil(tok)?;
Some(BinOp::from_sigil(s))
}
pub fn min_precedence() -> i32 {
i32::min_value()
}
pub fn get_precedence_from_token(op_tok: &TokenKind) -> Option<i32> {
let s = token_kind_to_sigil(op_tok)?;
Some(binop_precedences(s))
}
pub fn get_precedence(&self) -> i32 {
binop_precedences(&self.sigil)
}
}
fn token_kind_to_sigil<'a>(tok: &'a TokenKind) -> Option<&'a str> {
use self::TokenKind::*;
Some(match tok {
Operator(op) => op.as_str(),
Period => ".",
Pipe => "|",
Slash => "/",
LAngleBracket => "<",
RAngleBracket => ">",
Equals => "=",
_ => return None
})
}
fn binop_precedences(s: &str) -> i32 {
let default = 10_000_000;
match s {
"+" => 10,
"-" => 10,
"*" => 20,
"/" => 20,
"%" => 20,
"++" => 30,
"^" => 30,
"&" => 20,
"|" => 20,
">" => 20,
">=" => 20,
"<" => 20,
"<=" => 20,
"==" => 40,
"=" => 10,
"<=>" => 30,
_ => default,
}
}


@@ -1,102 +0,0 @@
use std::str::FromStr;
use crate::typechecking::{TypeConst, Type};
#[derive(Debug, Clone, Copy, PartialEq)]
pub enum Builtin {
Add,
Increment,
Subtract,
Negate,
Multiply,
Divide,
Quotient,
Modulo,
Exponentiation,
BitwiseAnd,
BitwiseOr,
BooleanAnd,
BooleanOr,
BooleanNot,
Equality,
LessThan,
LessThanOrEqual,
GreaterThan,
GreaterThanOrEqual,
Comparison,
FieldAccess,
IOPrint,
IOPrintLn,
IOGetLine,
Assignment,
Concatenate,
}
impl Builtin {
pub fn get_type(&self) -> Type {
use Builtin::*;
match self {
Add => ty!(Nat -> Nat -> Nat),
Subtract => ty!(Nat -> Nat -> Nat),
Multiply => ty!(Nat -> Nat -> Nat),
Divide => ty!(Nat -> Nat -> Float),
Quotient => ty!(Nat -> Nat -> Nat),
Modulo => ty!(Nat -> Nat -> Nat),
Exponentiation => ty!(Nat -> Nat -> Nat),
BitwiseAnd => ty!(Nat -> Nat -> Nat),
BitwiseOr => ty!(Nat -> Nat -> Nat),
BooleanAnd => ty!(Bool -> Bool -> Bool),
BooleanOr => ty!(Bool -> Bool -> Bool),
BooleanNot => ty!(Bool -> Bool),
Equality => ty!(Nat -> Nat -> Bool),
LessThan => ty!(Nat -> Nat -> Bool),
LessThanOrEqual => ty!(Nat -> Nat -> Bool),
GreaterThan => ty!(Nat -> Nat -> Bool),
GreaterThanOrEqual => ty!(Nat -> Nat -> Bool),
Comparison => ty!(Nat -> Nat -> Ordering),
FieldAccess => ty!(Unit),
IOPrint => ty!(Unit),
IOPrintLn => ty!(Unit) ,
IOGetLine => ty!(StringT),
Assignment => ty!(Unit),
Concatenate => ty!(StringT -> StringT -> StringT),
Increment => ty!(Nat -> Int),
Negate => ty!(Nat -> Int)
}
}
}
impl FromStr for Builtin {
type Err = ();
fn from_str(s: &str) -> Result<Self, Self::Err> {
use Builtin::*;
Ok(match s {
"+" => Add,
"-" => Subtract,
"*" => Multiply,
"/" => Divide,
"quot" => Quotient,
"%" => Modulo,
"++" => Concatenate,
"^" => Exponentiation,
"&" => BitwiseAnd,
"&&" => BooleanAnd,
"|" => BitwiseOr,
"||" => BooleanOr,
"!" => BooleanNot,
">" => GreaterThan,
">=" => GreaterThanOrEqual,
"<" => LessThan,
"<=" => LessThanOrEqual,
"==" => Equality,
"=" => Assignment,
"<=>" => Comparison,
"." => FieldAccess,
"print" => IOPrint,
"println" => IOPrintLn,
"getline" => IOGetLine,
_ => return Err(())
})
}
}


@@ -1,10 +0,0 @@
use crate::ast::*;
impl AST {
pub fn compact_debug(&self) -> String {
format!("{:?}", self)
}
pub fn expanded_debug(&self) -> String {
format!("{:#?}", self)
}
}


@@ -1,502 +0,0 @@
use std::cell::RefCell;
use std::rc::Rc;
use std::fmt::Write;
use std::io;
use itertools::Itertools;
use crate::util::ScopeStack;
use crate::reduced_ast::{BoundVars, ReducedAST, Stmt, Expr, Lit, Func, Alternative, Subpattern};
use crate::symbol_table::{SymbolSpec, Symbol, SymbolTable, ScopeSegment, ScopeSegmentKind, FullyQualifiedSymbolName};
use crate::builtin::Builtin;
mod test;
pub struct State<'a> {
values: ScopeStack<'a, Rc<String>, ValueEntry>,
symbol_table_handle: Rc<RefCell<SymbolTable>>,
}
impl<'a> State<'a> {
pub fn new(symbol_table_handle: Rc<RefCell<SymbolTable>>) -> State<'a> {
let values = ScopeStack::new(Some(format!("global")));
State { values, symbol_table_handle }
}
pub fn debug_print(&self) -> String {
format!("Values: {:?}", self.values)
}
fn new_frame(&'a self, items: &'a Vec<Node>, bound_vars: &BoundVars) -> State<'a> {
let mut inner_state = State {
values: self.values.new_scope(None),
symbol_table_handle: self.symbol_table_handle.clone(),
};
for (bound_var, val) in bound_vars.iter().zip(items.iter()) {
if let Some(bv) = bound_var.as_ref() {
inner_state.values.insert(bv.clone(), ValueEntry::Binding { constant: true, val: val.clone() });
}
}
inner_state
}
}
#[derive(Debug, Clone)]
enum Node {
Expr(Expr),
PrimObject {
name: Rc<String>,
tag: usize,
items: Vec<Node>,
},
PrimTuple {
items: Vec<Node>
}
}
fn paren_wrapped_vec(terms: impl Iterator<Item=String>) -> String {
let mut buf = String::new();
write!(buf, "(").unwrap();
for term in terms.map(|e| Some(e)).intersperse(None) {
match term {
Some(e) => write!(buf, "{}", e).unwrap(),
None => write!(buf, ", ").unwrap(),
};
}
write!(buf, ")").unwrap();
buf
}
impl Node {
fn to_repl(&self, symbol_table: &SymbolTable) -> String {
match self {
Node::Expr(e) => e.to_repl(symbol_table),
Node::PrimObject { name, items, .. } if items.len() == 0 => format!("{}", name),
Node::PrimObject { name, items, .. } => format!("{}{}", name, paren_wrapped_vec(items.iter().map(|x| x.to_repl(symbol_table)))),
Node::PrimTuple { items } => format!("{}", paren_wrapped_vec(items.iter().map(|x| x.to_repl(symbol_table)))),
}
}
fn is_true(&self) -> bool {
match self {
Node::Expr(Expr::Lit(crate::reduced_ast::Lit::Bool(true))) => true,
_ => false,
}
}
}
#[derive(Debug)]
enum ValueEntry {
Binding {
constant: bool,
val: /*FullyEvaluatedExpr*/ Node, //TODO make this use a subtype to represent fully evaluatedness
}
}
type EvalResult<T> = Result<T, String>;
impl Expr {
fn to_node(self) -> Node {
Node::Expr(self)
}
fn to_repl(&self, symbol_table: &SymbolTable) -> String {
use self::Lit::*;
use self::Func::*;
let _ = symbol_table;
match self {
Expr::Lit(ref l) => match l {
Nat(n) => format!("{}", n),
Int(i) => format!("{}", i),
Float(f) => format!("{}", f),
Bool(b) => format!("{}", b),
StringLit(s) => format!("\"{}\"", s),
},
Expr::Func(f) => match f {
BuiltIn(builtin) => format!("<built-in function '{:?}'>", builtin),
UserDefined { name: None, .. } => format!("<function>"),
UserDefined { name: Some(name), .. } => format!("<function '{}'>", name),
},
Expr::Constructor { type_name, arity, .. } => {
format!("<constructor for `{}` arity {}>", type_name, arity)
},
Expr::Tuple(exprs) => paren_wrapped_vec(exprs.iter().map(|x| x.to_repl(symbol_table))),
_ => format!("{:?}", self),
}
}
fn replace_conditional_target_sigil(self, replacement: &Expr) -> Expr {
use self::Expr::*;
match self {
ConditionalTargetSigilValue => replacement.clone(),
Unit | Lit(_) | Func(_) | Sym(_) | Constructor { .. } |
CaseMatch { .. } | UnimplementedSigilValue | ReductionError(_) => self,
Tuple(exprs) => Tuple(exprs.into_iter().map(|e| e.replace_conditional_target_sigil(replacement)).collect()),
Call { f, args } => {
let new_args = args.into_iter().map(|e| e.replace_conditional_target_sigil(replacement)).collect();
Call { f, args: new_args }
},
Conditional { .. } => panic!("Dunno if I need this, but if so implement"),
Assign { .. } => panic!("I'm pretty sure I don't need this"),
}
}
}
impl<'a> State<'a> {
pub fn evaluate(&mut self, ast: ReducedAST, repl: bool) -> Vec<Result<String, String>> {
let mut acc = vec![];
// handle prebindings
for statement in ast.0.iter() {
self.prebinding(statement);
}
for statement in ast.0 {
match self.statement(statement) {
Ok(Some(ref output)) if repl => {
let ref symbol_table = self.symbol_table_handle.borrow();
acc.push(Ok(output.to_repl(symbol_table)))
},
Ok(_) => (),
Err(error) => {
acc.push(Err(format!("Runtime error: {}", error)));
return acc;
},
}
}
acc
}
fn prebinding(&mut self, stmt: &Stmt) {
match stmt {
Stmt::PreBinding { name, func } => {
let v_entry = ValueEntry::Binding { constant: true, val: Node::Expr(Expr::Func(func.clone())) };
self.values.insert(name.clone(), v_entry);
},
Stmt::Expr(_expr) => {
//TODO have this support things like nested function defs
},
_ => ()
}
}
fn statement(&mut self, stmt: Stmt) -> EvalResult<Option<Node>> {
match stmt {
Stmt::Binding { name, constant, expr } => {
let val = self.expression(Node::Expr(expr))?;
self.values.insert(name.clone(), ValueEntry::Binding { constant, val });
Ok(None)
},
Stmt::Expr(expr) => Ok(Some(self.expression(expr.to_node())?)),
Stmt::PreBinding {..} | Stmt::Noop => Ok(None),
}
}
fn block(&mut self, stmts: Vec<Stmt>) -> EvalResult<Node> {
let mut ret = None;
for stmt in stmts {
ret = self.statement(stmt)?;
}
Ok(ret.unwrap_or(Node::Expr(Expr::Unit)))
}
fn expression(&mut self, node: Node) -> EvalResult<Node> {
use self::Expr::*;
match node {
t @ Node::PrimTuple { .. } => Ok(t),
obj @ Node::PrimObject { .. } => Ok(obj),
Node::Expr(expr) => match expr {
literal @ Lit(_) => Ok(Node::Expr(literal)),
Call { box f, args } => self.call_expression(f, args),
Sym(v) => self.handle_sym(v),
Constructor { arity, ref name, tag, .. } if arity == 0 => Ok(Node::PrimObject { name: name.clone(), tag, items: vec![] }),
constructor @ Constructor { .. } => Ok(Node::Expr(constructor)),
func @ Func(_) => Ok(Node::Expr(func)),
Tuple(exprs) => {
let nodes = exprs.into_iter().map(|expr| self.expression(Node::Expr(expr))).collect::<Result<Vec<Node>,_>>()?;
Ok(Node::PrimTuple { items: nodes })
},
Conditional { box cond, then_clause, else_clause } => self.conditional(cond, then_clause, else_clause),
Assign { box val, box expr } => self.assign_expression(val, expr),
Unit => Ok(Node::Expr(Unit)),
CaseMatch { box cond, alternatives } => self.case_match_expression(cond, alternatives),
ConditionalTargetSigilValue => Ok(Node::Expr(ConditionalTargetSigilValue)),
UnimplementedSigilValue => Err(format!("Sigil value eval not implemented")),
ReductionError(err) => Err(format!("Reduction error: {}", err)),
}
}
}
fn call_expression(&mut self, f: Expr, args: Vec<Expr>) -> EvalResult<Node> {
use self::Expr::*;
match self.expression(Node::Expr(f))? {
Node::Expr(Constructor { type_name, name, tag, arity }) => self.apply_data_constructor(type_name, name, tag, arity, args),
Node::Expr(Func(f)) => self.apply_function(f, args),
other => return Err(format!("Tried to call {:?} which is not a function or data constructor", other)),
}
}
fn apply_data_constructor(&mut self, _type_name: Rc<String>, name: Rc<String>, tag: usize, arity: usize, args: Vec<Expr>) -> EvalResult<Node> {
if arity != args.len() {
return Err(format!("Data constructor {} requires {} arg(s)", name, arity));
}
let evaled_args = args.into_iter().map(|expr| self.expression(Node::Expr(expr))).collect::<Result<Vec<Node>,_>>()?;
//let evaled_args = vec![];
Ok(Node::PrimObject {
name: name.clone(),
items: evaled_args,
tag
})
}
fn apply_function(&mut self, f: Func, args: Vec<Expr>) -> EvalResult<Node> {
match f {
Func::BuiltIn(builtin) => Ok(self.apply_builtin(builtin, args)?),
Func::UserDefined { params, body, name } => {
if params.len() != args.len() {
return Err(format!("calling a {}-argument function with {} args", params.len(), args.len()))
}
let mut func_state = State {
values: self.values.new_scope(name.map(|n| format!("{}", n))),
symbol_table_handle: self.symbol_table_handle.clone(),
};
for (param, val) in params.into_iter().zip(args.into_iter()) {
let val = func_state.expression(Node::Expr(val))?;
func_state.values.insert(param, ValueEntry::Binding { constant: true, val });
}
// TODO figure out function return semantics
func_state.block(body)
}
}
}
fn apply_builtin(&mut self, builtin: Builtin, args: Vec<Expr>) -> EvalResult<Node> {
use self::Expr::*;
use self::Lit::*;
use Builtin::*;
let evaled_args: Result<Vec<Node>, String> = args.into_iter().map(|arg| self.expression(arg.to_node()))
.collect();
let evaled_args = evaled_args?;
Ok(match (builtin, evaled_args.as_slice()) {
(FieldAccess, &[Node::PrimObject { .. }]) => {
//TODO implement field access
unimplemented!()
},
(binop, &[Node::Expr(ref lhs), Node::Expr(ref rhs)]) => match (binop, lhs, rhs) {
/* binops */
(Add, Lit(Nat(l)), Lit(Nat(r))) => Lit(Nat(l + r)),
(Concatenate, Lit(StringLit(ref s1)), Lit(StringLit(ref s2))) => Lit(StringLit(Rc::new(format!("{}{}", s1, s2)))),
(Subtract, Lit(Nat(l)), Lit(Nat(r))) => Lit(Nat(l - r)),
(Multiply, Lit(Nat(l)), Lit(Nat(r))) => Lit(Nat(l * r)),
(Divide, Lit(Nat(l)), Lit(Nat(r))) => Lit(Float((*l as f64)/ (*r as f64))),
(Quotient, Lit(Nat(l)), Lit(Nat(r))) => if *r == 0 {
return Err(format!("divide by zero"));
} else {
Lit(Nat(l / r))
},
(Modulo, Lit(Nat(l)), Lit(Nat(r))) => Lit(Nat(l % r)),
(Exponentiation, Lit(Nat(l)), Lit(Nat(r))) => Lit(Nat(l ^ r)),
(BitwiseAnd, Lit(Nat(l)), Lit(Nat(r))) => Lit(Nat(l & r)),
(BitwiseOr, Lit(Nat(l)), Lit(Nat(r))) => Lit(Nat(l | r)),
/* comparisons */
(Equality, Lit(Nat(l)), Lit(Nat(r))) => Lit(Bool(l == r)),
(Equality, Lit(Int(l)), Lit(Int(r))) => Lit(Bool(l == r)),
(Equality, Lit(Float(l)), Lit(Float(r))) => Lit(Bool(l == r)),
(Equality, Lit(Bool(l)), Lit(Bool(r))) => Lit(Bool(l == r)),
(Equality, Lit(StringLit(ref l)), Lit(StringLit(ref r))) => Lit(Bool(l == r)),
(LessThan, Lit(Nat(l)), Lit(Nat(r))) => Lit(Bool(l < r)),
(LessThan, Lit(Int(l)), Lit(Int(r))) => Lit(Bool(l < r)),
(LessThan, Lit(Float(l)), Lit(Float(r))) => Lit(Bool(l < r)),
(LessThanOrEqual, Lit(Nat(l)), Lit(Nat(r))) => Lit(Bool(l <= r)),
(LessThanOrEqual, Lit(Int(l)), Lit(Int(r))) => Lit(Bool(l <= r)),
(LessThanOrEqual, Lit(Float(l)), Lit(Float(r))) => Lit(Bool(l <= r)),
(GreaterThan, Lit(Nat(l)), Lit(Nat(r))) => Lit(Bool(l > r)),
(GreaterThan, Lit(Int(l)), Lit(Int(r))) => Lit(Bool(l > r)),
(GreaterThan, Lit(Float(l)), Lit(Float(r))) => Lit(Bool(l > r)),
(GreaterThanOrEqual, Lit(Nat(l)), Lit(Nat(r))) => Lit(Bool(l >= r)),
(GreaterThanOrEqual, Lit(Int(l)), Lit(Int(r))) => Lit(Bool(l >= r)),
(GreaterThanOrEqual, Lit(Float(l)), Lit(Float(r))) => Lit(Bool(l >= r)),
_ => return Err("No valid binop".to_string())
}.to_node(),
(prefix, &[Node::Expr(ref arg)]) => match (prefix, arg) {
(BooleanNot, Lit(Bool(true))) => Lit(Bool(false)),
(BooleanNot, Lit(Bool(false))) => Lit(Bool(true)),
(Negate, Lit(Nat(n))) => Lit(Int(-1*(*n as i64))),
(Negate, Lit(Int(n))) => Lit(Int(-1*(*n as i64))),
(Increment, Lit(Int(n))) => Lit(Int(*n)),
(Increment, Lit(Nat(n))) => Lit(Nat(*n)),
_ => return Err("No valid prefix op".to_string())
}.to_node(),
/* builtin functions */
(IOPrint, &[ref anything]) => {
let ref symbol_table = self.symbol_table_handle.borrow();
print!("{}", anything.to_repl(symbol_table));
Expr::Unit.to_node()
},
(IOPrintLn, &[ref anything]) => {
let ref symbol_table = self.symbol_table_handle.borrow();
println!("{}", anything.to_repl(symbol_table));
Expr::Unit.to_node()
},
(IOGetLine, &[]) => {
let mut buf = String::new();
io::stdin().read_line(&mut buf).expect("Error reading line in 'getline'");
Lit(StringLit(Rc::new(buf.trim().to_string()))).to_node()
},
(x, args) => return Err(format!("bad or unimplemented builtin {:?} | {:?}", x, args)),
})
}
fn conditional(&mut self, cond: Expr, then_clause: Vec<Stmt>, else_clause: Vec<Stmt>) -> EvalResult<Node> {
let cond = self.expression(Node::Expr(cond))?;
Ok(match cond {
Node::Expr(Expr::Lit(Lit::Bool(true))) => self.block(then_clause)?,
Node::Expr(Expr::Lit(Lit::Bool(false))) => self.block(else_clause)?,
_ => return Err(format!("Conditional with non-boolean condition"))
})
}
fn assign_expression(&mut self, val: Expr, expr: Expr) -> EvalResult<Node> {
let name = match val {
Expr::Sym(name) => name,
_ => return Err(format!("Trying to assign to a non-value")),
};
let constant = match self.values.lookup(&name) {
None => return Err(format!("Constant {} is undefined", name)),
Some(ValueEntry::Binding { constant, .. }) => constant.clone(),
};
if constant {
return Err(format!("trying to update {}, a non-mutable binding", name));
}
let val = self.expression(Node::Expr(expr))?;
self.values.insert(name.clone(), ValueEntry::Binding { constant: false, val });
Ok(Node::Expr(Expr::Unit))
}
fn guard_passes(&mut self, guard: &Option<Expr>, cond: &Node) -> EvalResult<bool> {
if let Some(ref guard_expr) = guard {
let guard_expr = match cond {
Node::Expr(ref e) => guard_expr.clone().replace_conditional_target_sigil(e),
_ => guard_expr.clone()
};
Ok(self.expression(guard_expr.to_node())?.is_true())
} else {
Ok(true)
}
}
fn case_match_expression(&mut self, cond: Expr, alternatives: Vec<Alternative>) -> EvalResult<Node> {
//TODO need to handle recursive subpatterns
let all_subpatterns_pass = |state: &mut State, subpatterns: &Vec<Option<Subpattern>>, items: &Vec<Node>| -> EvalResult<bool> {
if subpatterns.len() == 0 {
return Ok(true)
}
if items.len() != subpatterns.len() {
return Err(format!("Subpattern length isn't correct items {} subpatterns {}", items.len(), subpatterns.len()));
}
for (maybe_subp, cond) in subpatterns.iter().zip(items.iter()) {
if let Some(subp) = maybe_subp {
if !state.guard_passes(&subp.guard, &cond)? {
return Ok(false)
}
}
}
Ok(true)
};
let cond = self.expression(Node::Expr(cond))?;
for alt in alternatives {
// no matter what type of condition we have, ignore alternative if the guard evaluates false
if !self.guard_passes(&alt.matchable.guard, &cond)? {
continue;
}
match cond {
Node::PrimObject { ref tag, ref items, .. } => {
if alt.matchable.tag.map(|t| t == *tag).unwrap_or(true) {
let mut inner_state = self.new_frame(items, &alt.matchable.bound_vars);
if all_subpatterns_pass(&mut inner_state, &alt.matchable.subpatterns, items)? {
return inner_state.block(alt.item);
} else {
continue;
}
}
},
Node::PrimTuple { ref items } => {
let mut inner_state = self.new_frame(items, &alt.matchable.bound_vars);
if all_subpatterns_pass(&mut inner_state, &alt.matchable.subpatterns, items)? {
return inner_state.block(alt.item);
} else {
continue;
}
},
Node::Expr(ref _e) => {
if let None = alt.matchable.tag {
return self.block(alt.item)
}
}
}
}
Err(format!("{:?} failed pattern match", cond))
}
//TODO if I don't need to lookup by name here...
fn handle_sym(&mut self, name: Rc<String>) -> EvalResult<Node> {
use self::ValueEntry::*;
use self::Func::*;
//TODO add a layer of indirection here to talk to the symbol table first, and only then look up
//in the values table
let symbol_table = self.symbol_table_handle.borrow();
let value = symbol_table.lookup_by_fqsn(&fqsn!(name ; tr));
Ok(match value {
Some(Symbol { name, spec, .. }) => match spec {
//TODO I'll need this type_name later to do a table lookup
SymbolSpec::DataConstructor { type_name: _type_name, type_args, .. } => {
if type_args.len() == 0 {
Node::PrimObject { name: name.clone(), tag: 0, items: vec![] }
} else {
return Err(format!("This data constructor thing not done"))
}
},
SymbolSpec::Func(_) => match self.values.lookup(&name) {
Some(Binding { val: Node::Expr(Expr::Func(UserDefined { name, params, body })), .. }) => {
Node::Expr(Expr::Func(UserDefined { name: name.clone(), params: params.clone(), body: body.clone() }))
},
_ => unreachable!(),
},
SymbolSpec::RecordConstructor { .. } => return Err(format!("This shouldn't be a record!")),
SymbolSpec::Binding => match self.values.lookup(&name) {
Some(Binding { val, .. }) => val.clone(),
None => return Err(format!("Symbol {} exists in symbol table but not in evaluator table", name))
}
},
//TODO ideally this should be returning a runtime error if this is ever None, but it's not
//handling all bindings correctly yet
//None => return Err(format!("Couldn't find value {}", name)),
None => match self.values.lookup(&name) {
Some(Binding { val, .. }) => val.clone(),
None => return Err(format!("Couldn't find value {}", name)),
}
})
}
}


@@ -1,260 +0,0 @@
#![cfg(test)]
use std::cell::RefCell;
use std::rc::Rc;
use crate::symbol_table::SymbolTable;
use crate::scope_resolution::ScopeResolver;
use crate::reduced_ast::reduce;
use crate::eval::State;
fn evaluate_all_outputs(input: &str) -> Vec<Result<String, String>> {
let symbol_table = Rc::new(RefCell::new(SymbolTable::new()));
let mut state = State::new(symbol_table);
let mut ast = crate::util::quick_ast(input);
state.symbol_table_handle.borrow_mut().add_top_level_symbols(&ast).unwrap();
{
let mut t = &mut state.symbol_table_handle.borrow_mut();
let mut scope_resolver = crate::scope_resolution::ScopeResolver::new(&mut t);
let _ = scope_resolver.resolve(&mut ast);
}
let reduced = reduce(&ast, &state.symbol_table_handle.borrow());
let all_output = state.evaluate(reduced, true);
all_output
}
macro_rules! test_in_fresh_env {
($string:expr, $correct:expr) => {
{
let all_output = evaluate_all_outputs($string);
let ref output = all_output.last().unwrap();
assert_eq!(**output, Ok($correct.to_string()));
}
}
}
#[test]
fn test_basic_eval() {
test_in_fresh_env!("1 + 2", "3");
test_in_fresh_env!("let mut a = 1; a = 2", "Unit");
/*
test_in_fresh_env!("let mut a = 1; a = 2; a", "2");
test_in_fresh_env!(r#"("a", 1 + 2)"#, r#"("a", 3)"#);
*/
}
#[test]
fn op_eval() {
test_in_fresh_env!("- 13", "-13");
test_in_fresh_env!("10 - 2", "8");
}
#[test]
fn function_eval() {
test_in_fresh_env!("fn oi(x) { x + 1 }; oi(4)", "5");
test_in_fresh_env!("fn oi(x) { x + 1 }; oi(1+2)", "4");
}
#[test]
fn scopes() {
let scope_ok = r#"
let a = 20
fn haha() {
let a = 10
a
}
haha()
"#;
test_in_fresh_env!(scope_ok, "10");
let scope_ok = r#"
let a = 20
fn haha() {
let a = 10
a
}
a
"#;
test_in_fresh_env!(scope_ok, "20");
}
#[test]
fn if_is_patterns() {
let source = r#"
type Option<T> = Some(T) | None
let x = Option::Some(9); if x is Option::Some(q) then { q } else { 0 }"#;
test_in_fresh_env!(source, "9");
let source = r#"
type Option<T> = Some(T) | None
let x = Option::None; if x is Option::Some(q) then { q } else { 0 }"#;
test_in_fresh_env!(source, "0");
}
#[test]
fn full_if_matching() {
let source = r#"
type Option<T> = Some(T) | None
let a = Option::None
if a { is Option::None -> 4, is Option::Some(x) -> x }
"#;
test_in_fresh_env!(source, "4");
let source = r#"
type Option<T> = Some(T) | None
let a = Option::Some(99)
if a { is Option::None -> 4, is Option::Some(x) -> x }
"#;
test_in_fresh_env!(source, "99");
let source = r#"
let a = 10
if a { is 10 -> "x", is 4 -> "y" }
"#;
test_in_fresh_env!(source, "\"x\"");
let source = r#"
let a = 10
if a { is 15 -> "x", is 10 -> "y" }
"#;
test_in_fresh_env!(source, "\"y\"");
}
#[test]
fn string_pattern() {
let source = r#"
let a = "foo"
if a { is "foo" -> "x", is _ -> "y" }
"#;
test_in_fresh_env!(source, "\"x\"");
}
#[test]
fn boolean_pattern() {
let source = r#"
let a = true
if a {
is true -> "x",
is false -> "y"
}
"#;
test_in_fresh_env!(source, "\"x\"");
}
#[test]
fn boolean_pattern_2() {
let source = r#"
let a = false
if a { is true -> "x", is false -> "y" }
"#;
test_in_fresh_env!(source, "\"y\"");
}
#[test]
fn ignore_pattern() {
let source = r#"
type Option<T> = Some(T) | None
if Option::Some(10) {
is _ -> "hella"
}
"#;
test_in_fresh_env!(source, "\"hella\"");
}
#[test]
fn tuple_pattern() {
let source = r#"
if (1, 2) {
is (1, x) -> x,
is _ -> 99
}
"#;
test_in_fresh_env!(source, 2);
}
#[test]
fn tuple_pattern_2() {
let source = r#"
if (1, 2) {
is (10, x) -> x,
is (y, x) -> x + y
}
"#;
test_in_fresh_env!(source, 3);
}
#[test]
fn tuple_pattern_3() {
let source = r#"
if (1, 5) {
is (10, x) -> x,
is (1, x) -> x
}
"#;
test_in_fresh_env!(source, 5);
}
#[test]
fn tuple_pattern_4() {
let source = r#"
if (1, 5) {
is (10, x) -> x,
is (1, x) -> x,
}
"#;
test_in_fresh_env!(source, 5);
}
#[test]
fn prim_obj_pattern() {
let source = r#"
type Stuff = Mulch(Nat) | Jugs(Nat, String) | Mardok
let a = Stuff::Mulch(20)
let b = Stuff::Jugs(1, "haha")
let c = Stuff::Mardok
let x = if a {
is Stuff::Mulch(20) -> "x",
is _ -> "ERR"
}
let y = if b {
is Stuff::Mulch(n) -> "ERR",
is Stuff::Jugs(2, _) -> "ERR",
is Stuff::Jugs(1, s) -> s,
is _ -> "ERR",
}
let z = if c {
is Stuff::Jugs(_, _) -> "ERR",
is Stuff::Mardok -> "NIGH",
is _ -> "ERR",
}
(x, y, z)
"#;
test_in_fresh_env!(source, r#"("x", "haha", "NIGH")"#);
}
#[test]
fn basic_lambda_syntax() {
let source = r#"
let q = \(x, y) { x * y }
let x = q(5,2)
let y = \(m, n, o) { m + n + o }(1,2,3)
(x, y)
"#;
test_in_fresh_env!(source, r"(10, 6)");
}
#[test]
fn lambda_syntax_2() {
let source = r#"
fn milta() {
\(x) { x + 33 }
}
milta()(10)
"#;
test_in_fresh_env!(source, "43");
}

View File

@@ -1,44 +0,0 @@
#![feature(trace_macros)]
#![feature(custom_attribute)]
//#![feature(unrestricted_attribute_tokens)]
#![feature(slice_patterns, box_patterns, box_syntax)]
//! `schala-lang` is where the Schala programming language is actually implemented.
//! It defines the `Schala` type, which contains the state for a Schala REPL, and implements
//! `ProgrammingLanguageInterface` and the chain of compiler passes for it.
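//!
//! As a rough sketch (mirroring the stage names the REPL reports), a computation flows through:
//! tokenizing -> parsing -> symbol-table -> scope-resolution -> typechecking -> ast-reduction ->
//! ast-walking-evaluation.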
extern crate itertools;
#[macro_use]
extern crate lazy_static;
#[macro_use]
extern crate maplit;
extern crate schala_repl;
#[macro_use]
extern crate schala_lang_codegen;
extern crate ena;
extern crate derivative;
macro_rules! bx {
($e:expr) => { Box::new($e) }
}
#[macro_use]
mod util;
#[macro_use]
mod typechecking;
mod debugging;
mod tokenizing;
mod ast;
mod parsing;
#[macro_use]
mod symbol_table;
mod scope_resolution;
mod builtin;
mod reduced_ast;
mod eval;
mod schala;
pub use schala::Schala;

File diff suppressed because it is too large

View File

@@ -1,709 +0,0 @@
#![cfg(test)]
use ::std::rc::Rc;
use std::str::FromStr;
use super::tokenize;
use super::ParseResult;
use crate::ast::{ItemIdStore, AST, Meta, Expression, Statement, StatementKind, IfExpressionBody, Discriminator, Pattern, PatternLiteral, TypeBody, Enumerator, ForBody, InvocationArgument, FormalParam, PrefixOp, BinOp, QualifiedName};
use super::Declaration::*;
use super::Signature;
use super::TypeIdentifier::*;
use super::TypeSingletonName;
use super::ExpressionKind::*;
use super::Variant::*;
use super::ForBody::*;
fn parse(input: &str) -> ParseResult<AST> {
let tokens: Vec<crate::tokenizing::Token> = tokenize(input);
let mut parser = super::Parser::new(tokens);
parser.parse()
}
macro_rules! parse_test {
($string:expr, $correct:expr) => {
assert_eq!(parse($string).unwrap(), $correct)
};
}
macro_rules! parse_test_wrap_ast {
($string:expr, $correct:expr) => { parse_test!($string, AST { id: ItemIdStore::new_id(), statements: vec![$correct] }) }
}
macro_rules! parse_error {
($string:expr) => { assert!(parse($string).is_err()) }
}
macro_rules! qname {
( $( $component:expr),* ) => {
{
let mut components = vec![];
$(
components.push(rc!($component));
)*
QualifiedName { components, id: ItemIdStore::new_id() }
}
};
}
macro_rules! val {
($var:expr) => { Value(QualifiedName { components: vec![Rc::new($var.to_string())], id: ItemIdStore::new_id() }) };
}
macro_rules! ty {
($name:expr) => { Singleton(tys!($name)) }
}
macro_rules! tys {
($name:expr) => { TypeSingletonName { name: Rc::new($name.to_string()), params: vec![] } };
}
macro_rules! decl {
($expr_type:expr) => {
Statement { id: ItemIdStore::new_id(), kind: StatementKind::Declaration($expr_type) }
};
}
macro_rules! ex {
($expr_type:expr) => { Expression::new(ItemIdStore::new_id(), $expr_type) };
(m $expr_type:expr) => { Meta::new(Expression::new(ItemIdStore::new_id(), $expr_type)) };
(m $expr_type:expr, $type_anno:expr) => { Meta::new(Expression::with_anno(ItemIdStore::new_id(), $expr_type, $type_anno)) };
(s $expr_text:expr) => {
{
let tokens: Vec<crate::tokenizing::Token> = tokenize($expr_text);
let mut parser = super::Parser::new(tokens);
parser.expression().unwrap()
}
};
}
macro_rules! inv {
($expr_type:expr) => { InvocationArgument::Positional($expr_type) }
}
macro_rules! binexp {
($op:expr, $lhs:expr, $rhs:expr) => { BinExp(BinOp::from_sigil($op), bx!(Expression::new(ItemIdStore::new_id(), $lhs).into()), bx!(Expression::new(ItemIdStore::new_id(), $rhs).into())) }
}
macro_rules! prefexp {
($op:expr, $lhs:expr) => { PrefixExp(PrefixOp::from_str($op).unwrap(), bx!(Expression::new(ItemIdStore::new_id(), $lhs).into())) }
}
macro_rules! exst {
($expr_type:expr) => { Meta::new(Statement { id: ItemIdStore::new_id(), kind: StatementKind::Expression(Expression::new(ItemIdStore::new_id(), $expr_type).into())}) };
($expr_type:expr, $type_anno:expr) => { Meta::new(Statement { id: ItemIdStore::new_id(), kind: StatementKind::Expression(Expression::with_anno(ItemIdStore::new_id(), $expr_type, $type_anno).into())}) };
($op:expr, $lhs:expr, $rhs:expr) => { Meta::new(
Statement { id: ItemIdStore::new_id(), kind: StatementKind::Expression(ex!(binexp!($op, $lhs, $rhs)))}
)};
(s $statement_text:expr) => {
{
let tokens: Vec<crate::tokenizing::Token> = tokenize($statement_text);
let mut parser = super::Parser::new(tokens);
Meta::new(parser.statement().unwrap())
}
}
}
#[test]
fn parsing_number_literals_and_binexps() {
parse_test_wrap_ast! { ".2", exst!(FloatLiteral(0.2)) };
parse_test_wrap_ast! { "8.1", exst!(FloatLiteral(8.1)) };
parse_test_wrap_ast! { "0b010", exst!(NatLiteral(2)) };
parse_test_wrap_ast! { "0b0_1_0_", exst!(NatLiteral(2)) }
parse_test_wrap_ast! {"0xff", exst!(NatLiteral(255)) };
parse_test_wrap_ast! {"0xf_f_", exst!(NatLiteral(255)) };
parse_test_wrap_ast! {"0xf_f_+1", exst!(binexp!("+", NatLiteral(255), NatLiteral(1))) };
parse_test! {"3; 4; 4.3",
AST {
id: ItemIdStore::new_id(),
statements: vec![exst!(NatLiteral(3)), exst!(NatLiteral(4)),
exst!(FloatLiteral(4.3))]
}
};
parse_test_wrap_ast!("1 + 2 * 3",
exst!(binexp!("+", NatLiteral(1), binexp!("*", NatLiteral(2), NatLiteral(3))))
);
parse_test_wrap_ast!("1 * 2 + 3",
exst!(binexp!("+", binexp!("*", NatLiteral(1), NatLiteral(2)), NatLiteral(3)))
) ;
parse_test_wrap_ast!("1 && 2", exst!(binexp!("&&", NatLiteral(1), NatLiteral(2))));
parse_test_wrap_ast!("1 + 2 * 3 + 4", exst!(
binexp!("+",
binexp!("+", NatLiteral(1), binexp!("*", NatLiteral(2), NatLiteral(3))),
NatLiteral(4))));
parse_test_wrap_ast!("(1 + 2) * 3",
exst!(binexp!("*", binexp!("+", NatLiteral(1), NatLiteral(2)), NatLiteral(3))));
parse_test_wrap_ast!(".1 + .2", exst!(binexp!("+", FloatLiteral(0.1), FloatLiteral(0.2))));
parse_test_wrap_ast!("1 / 2", exst!(binexp!("/", NatLiteral(1), NatLiteral(2))));
}
#[test]
fn parsing_tuples() {
parse_test_wrap_ast!("()", exst!(TupleLiteral(vec![])));
parse_test_wrap_ast!("(\"hella\", 34)", exst!(
TupleLiteral(
vec![ex!(s r#""hella""#).into(), ex!(s "34").into()]
)
));
parse_test_wrap_ast!("((1+2), \"slough\")", exst!(TupleLiteral(vec![
ex!(binexp!("+", NatLiteral(1), NatLiteral(2))).into(),
ex!(StringLiteral(rc!(slough))).into(),
])))
}
#[test]
fn parsing_identifiers() {
parse_test_wrap_ast!("a", exst!(val!("a")));
parse_test_wrap_ast!("some_value", exst!(val!("some_value")));
parse_test_wrap_ast!("a + b", exst!(binexp!("+", val!("a"), val!("b"))));
//parse_test!("a[b]", AST(vec![Expression(
//parse_test!("a[]", <- TODO THIS NEEDS TO FAIL
//parse_test("a()[b]()[d]")
//TODO fix this parsing stuff
/*
parse_test! { "perspicacity()[a]", AST(vec![
exst!(Index {
indexee: bx!(ex!(Call { f: bx!(ex!(val!("perspicacity"))), arguments: vec![] })),
indexers: vec![ex!(val!("a"))]
})
])
}
*/
parse_test_wrap_ast!("a[b,c]", exst!(Index { indexee: bx!(ex!(m val!("a"))), indexers: vec![ex!(m val!("b")), ex!(m val!("c"))]} ));
parse_test_wrap_ast!("None", exst!(val!("None")));
parse_test_wrap_ast!("Pandas { a: x + y }",
exst!(NamedStruct { name: Meta::new(qname!(Pandas)), fields: vec![(rc!(a), ex!(m binexp!("+", val!("x"), val!("y"))))]})
);
parse_test_wrap_ast! { "Pandas { a: n, b: q, }",
exst!(NamedStruct { name: Meta::new(qname!(Pandas)), fields:
vec![(rc!(a), ex!(m val!("n"))), (rc!(b), ex!(m val!("q")))]
}
)
};
}
#[test]
fn qualified_identifiers() {
parse_test_wrap_ast! {
"let q_q = Yolo::Swaggins",
Meta::new(decl!(Binding { name: rc!(q_q), constant: true, type_anno: None,
expr: Meta::new(Expression::new(ItemIdStore::new_id(), Value(qname!(Yolo, Swaggins)))),
}))
}
parse_test_wrap_ast! {
"thing::item::call()",
exst!(Call { f: bx![ex!(m Value(qname!(thing, item, call)))], arguments: vec![] })
}
}
#[test]
fn reserved_words() {
parse_error!("module::item::call()");
}
#[test]
fn parsing_complicated_operators() {
parse_test_wrap_ast!("a <- b", exst!(binexp!("<-", val!("a"), val!("b"))));
parse_test_wrap_ast!("a || b", exst!(binexp!("||", val!("a"), val!("b"))));
parse_test_wrap_ast!("a<>b", exst!(binexp!("<>", val!("a"), val!("b"))));
parse_test_wrap_ast!("a.b.c.d", exst!(binexp!(".",
binexp!(".",
binexp!(".", val!("a"), val!("b")),
val!("c")),
val!("d"))));
parse_test_wrap_ast!("-3", exst!(prefexp!("-", NatLiteral(3))));
parse_test_wrap_ast!("-0.2", exst!(prefexp!("-", FloatLiteral(0.2))));
parse_test_wrap_ast!("!3", exst!(prefexp!("!", NatLiteral(3))));
parse_test_wrap_ast!("a <- -b", exst!(binexp!("<-", val!("a"), prefexp!("-", val!("b")))));
parse_test_wrap_ast!("a <--b", exst!(binexp!("<--", val!("a"), val!("b"))));
}
#[test]
fn parsing_functions() {
parse_test_wrap_ast!("fn oi()", Meta::new(decl!(FuncSig(Signature { name: rc!(oi), operator: false, params: vec![], type_anno: None }))));
parse_test_wrap_ast!("oi()", exst!(Call { f: bx!(ex!(m val!("oi"))), arguments: vec![] }));
parse_test_wrap_ast!("oi(a, 2 + 2)", exst!(Call
{ f: bx!(ex!(m val!("oi"))),
arguments: vec![inv!(ex!(m val!("a"))), inv!(ex!(m binexp!("+", NatLiteral(2), NatLiteral(2)))).into()]
}));
parse_error!("a(b,,c)");
parse_test_wrap_ast!("fn a(b, c: Int): Int", Meta::new(decl!(
FuncSig(Signature { name: rc!(a), operator: false, params: vec![
FormalParam { name: rc!(b), anno: None, default: None },
FormalParam { name: rc!(c), anno: Some(ty!("Int")), default: None }
], type_anno: Some(ty!("Int")) }))));
parse_test_wrap_ast!("fn a(x) { x() }", Meta::new(decl!(
FuncDecl(Signature { name: rc!(a), operator: false, params: vec![FormalParam { name: rc!(x), anno: None, default: None }], type_anno: None },
vec![exst!(Call { f: bx!(ex!(m val!("x"))), arguments: vec![] })]))));
parse_test_wrap_ast!("fn a(x) {\n x() }", Meta::new(decl!(
FuncDecl(Signature { name: rc!(a), operator: false, params: vec![FormalParam { name: rc!(x), anno: None, default: None }], type_anno: None },
vec![exst!(Call { f: bx!(ex!(m val!("x"))), arguments: vec![] })]))));
let multiline = r#"
fn a(x) {
x()
}
"#;
parse_test_wrap_ast!(multiline, Meta::new(decl!(
FuncDecl(Signature { name: rc!(a), operator: false, params: vec![FormalParam { name: rc!(x), default: None, anno: None }], type_anno: None },
vec![exst!(Call { f: bx!(ex!(m val!("x"))), arguments: vec![] })]))));
let multiline2 = r#"
fn a(x) {
x()
}
"#;
parse_test_wrap_ast!(multiline2, Meta::new(decl!(
FuncDecl(Signature { name: rc!(a), operator: false, params: vec![FormalParam { name: rc!(x), default: None, anno: None }], type_anno: None },
vec![exst!(s "x()")]))));
}
#[test]
fn functions_with_default_args() {
parse_test_wrap_ast! {
"fn func(x: Int, y: Int = 4) { }",
Meta::new(decl!(
FuncDecl(Signature { name: rc!(func), operator: false, type_anno: None, params: vec![
FormalParam { name: rc!(x), default: None, anno: Some(ty!("Int")) },
FormalParam { name: rc!(y), default: Some(Meta::new(ex!(s "4"))), anno: Some(ty!("Int")) }
]}, vec![])
))
};
}
#[test]
fn parsing_bools() {
parse_test_wrap_ast!("false", exst!(BoolLiteral(false)));
parse_test_wrap_ast!("true", exst!(BoolLiteral(true)));
}
#[test]
fn parsing_strings() {
parse_test_wrap_ast!(r#""hello""#, exst!(StringLiteral(rc!(hello))));
}
#[test]
fn parsing_types() {
parse_test_wrap_ast!("type Yolo = Yolo", Meta::new(decl!(TypeDecl { name: tys!("Yolo"), body: TypeBody(vec![UnitStruct(rc!(Yolo))]), mutable: false} )));
parse_test_wrap_ast!("type mut Yolo = Yolo", Meta::new(decl!(TypeDecl { name: tys!("Yolo"), body: TypeBody(vec![UnitStruct(rc!(Yolo))]), mutable: true} )));
parse_test_wrap_ast!("type alias Sex = Drugs", Meta::new(decl!(TypeAlias(rc!(Sex), rc!(Drugs)))));
parse_test_wrap_ast!("type Sanchez = Miguel | Alejandro(Int, Option<a>) | Esperanza { a: Int, b: String }",
Meta::new(decl!(TypeDecl {
name: tys!("Sanchez"),
body: TypeBody(vec![
UnitStruct(rc!(Miguel)),
TupleStruct(rc!(Alejandro), vec![
Singleton(TypeSingletonName { name: rc!(Int), params: vec![] }),
Singleton(TypeSingletonName { name: rc!(Option), params: vec![Singleton(TypeSingletonName { name: rc!(a), params: vec![] })] }),
]),
Record{
name: rc!(Esperanza),
members: vec![
(rc!(a), Singleton(TypeSingletonName { name: rc!(Int), params: vec![] })),
(rc!(b), Singleton(TypeSingletonName { name: rc!(String), params: vec![] })),
]
}
]),
mutable: false
})));
parse_test_wrap_ast! {
"type Jorge<a> = Diego | Kike(a)",
Meta::new(decl!(TypeDecl{
name: TypeSingletonName { name: rc!(Jorge), params: vec![Singleton(TypeSingletonName { name: rc!(a), params: vec![] })] },
body: TypeBody(vec![UnitStruct(rc!(Diego)), TupleStruct(rc!(Kike), vec![Singleton(TypeSingletonName { name: rc!(a), params: vec![] })])]),
mutable: false
}
))
};
}
#[test]
fn parsing_bindings() {
parse_test_wrap_ast!("let mut a = 10", Meta::new(decl!(Binding { name: rc!(a), constant: false, type_anno: None, expr: ex!(m NatLiteral(10)) } )));
parse_test_wrap_ast!("let a = 2 + 2", Meta::new(decl!(Binding { name: rc!(a), constant: true, type_anno: None, expr: ex!(m binexp!("+", NatLiteral(2), NatLiteral(2))) }) ));
parse_test_wrap_ast!("let a: Nat = 2 + 2", Meta::new(decl!(
Binding { name: rc!(a), constant: true, type_anno: Some(Singleton(TypeSingletonName { name: rc!(Nat), params: vec![] })),
expr: Meta::new(ex!(binexp!("+", NatLiteral(2), NatLiteral(2)))) }
)));
}
#[test]
fn parsing_block_expressions() {
parse_test_wrap_ast! {
"if a() then { b(); c() }", exst!(
IfExpression {
discriminator: bx! {
Discriminator::Simple(ex!(m Call { f: bx!(ex!(m val!("a"))), arguments: vec![]}))
},
body: bx! {
IfExpressionBody::SimpleConditional(
vec![exst!(Call { f: bx!(ex!(m val!("b"))), arguments: vec![]}), exst!(Call { f: bx!(ex!(m val!("c"))), arguments: vec![] })],
None
)
}
}
)
};
parse_test_wrap_ast! {
"if a() then { b(); c() } else { q }", exst!(
IfExpression {
discriminator: bx! {
Discriminator::Simple(ex!(m Call { f: bx!(ex!(m val!("a"))), arguments: vec![]}))
},
body: bx! {
IfExpressionBody::SimpleConditional(
vec![exst!(Call { f: bx!(ex!(m val!("b"))), arguments: vec![]}), exst!(Call { f: bx!(ex!(m val!("c"))), arguments: vec![] })],
Some(
vec![exst!(val!("q"))],
)
)
}
}
)
};
/*
parse_test!("if a() then { b(); c() }", AST(vec![exst!(
IfExpression(bx!(ex!(Call { f: bx!(ex!(val!("a"))), arguments: vec![]})),
vec![exst!(Call { f: bx!(ex!(val!("b"))), arguments: vec![]}), exst!(Call { f: bx!(ex!(val!("c"))), arguments: vec![] })],
None)
)]));
parse_test!(r#"
if true then {
const a = 10
b
} else {
c
}"#,
AST(vec![exst!(IfExpression(bx!(ex!(BoolLiteral(true))),
vec![decl!(Binding { name: rc!(a), constant: true, expr: ex!(NatLiteral(10)) }),
exst!(val!(rc!(b)))],
Some(vec![exst!(val!(rc!(c)))])))])
);
parse_test!("if a { b } else { c }", AST(vec![exst!(
IfExpression(bx!(ex!(val!("a"))),
vec![exst!(val!("b"))],
Some(vec![exst!(val!("c"))])))]));
parse_test!("if (A {a: 1}) { b } else { c }", AST(vec![exst!(
IfExpression(bx!(ex!(NamedStruct { name: rc!(A), fields: vec![(rc!(a), ex!(NatLiteral(1)))]})),
vec![exst!(val!("b"))],
Some(vec![exst!(val!("c"))])))]));
parse_error!("if A {a: 1} { b } else { c }");
*/
}
#[test]
fn parsing_interfaces() {
parse_test_wrap_ast!("interface Unglueable { fn unglue(a: Glue); fn mar(): Glue }",
Meta::new(decl!(Interface {
name: rc!(Unglueable),
signatures: vec![
Signature {
name: rc!(unglue),
operator: false,
params: vec![
FormalParam { name: rc!(a), anno: Some(Singleton(TypeSingletonName { name: rc!(Glue), params: vec![] })), default: None }
],
type_anno: None
},
Signature { name: rc!(mar), operator: false, params: vec![], type_anno: Some(Singleton(TypeSingletonName { name: rc!(Glue), params: vec![] })) },
]
}))
);
}
#[test]
fn parsing_impls() {
parse_test_wrap_ast!("impl Heh { fn yolo(); fn swagg(); }",
Meta::new(
decl!(Impl {
type_name: ty!("Heh"),
interface_name: None,
block: vec![
FuncSig(Signature { name: rc!(yolo), operator: false, params: vec![], type_anno: None }),
FuncSig(Signature { name: rc!(swagg), operator: false, params: vec![], type_anno: None })
] })));
parse_test_wrap_ast!("impl Mondai for Lollerino { fn yolo(); fn swagg(); }",
Meta::new(decl!(Impl {
type_name: ty!("Lollerino"),
interface_name: Some(TypeSingletonName { name: rc!(Mondai), params: vec![] }),
block: vec![
FuncSig(Signature { name: rc!(yolo), operator: false, params: vec![], type_anno: None}),
FuncSig(Signature { name: rc!(swagg), operator: false, params: vec![], type_anno: None })
] })));
parse_test_wrap_ast!("impl Hella<T> for (Alpha, Omega) { }",
Meta::new(decl!(Impl {
type_name: Tuple(vec![ty!("Alpha"), ty!("Omega")]),
interface_name: Some(TypeSingletonName { name: rc!(Hella), params: vec![ty!("T")] }),
block: vec![]
}))
);
parse_test_wrap_ast!("impl Option<WTFMate> { fn oi() }",
Meta::new(
decl!(Impl {
type_name: Singleton(TypeSingletonName { name: rc!(Option), params: vec![ty!("WTFMate")]}),
interface_name: None,
block: vec![
FuncSig(Signature { name: rc!(oi), operator: false, params: vec![], type_anno: None }),
]
})));
}
#[test]
fn parsing_type_annotations() {
parse_test_wrap_ast!("let a = b : Int",
Meta::new(
decl!(Binding { name: rc!(a), constant: true, type_anno: None, expr:
ex!(m val!("b"), ty!("Int")) })));
parse_test_wrap_ast!("a : Int",
exst!(val!("a"), ty!("Int"))
);
parse_test_wrap_ast!("a : Option<Int>",
exst!(val!("a"), Singleton(TypeSingletonName { name: rc!(Option), params: vec![ty!("Int")] }))
);
parse_test_wrap_ast!("a : KoreanBBQSpecifier<Kimchi, Option<Bulgogi> >",
exst!(val!("a"), Singleton(TypeSingletonName { name: rc!(KoreanBBQSpecifier), params: vec![
ty!("Kimchi"), Singleton(TypeSingletonName { name: rc!(Option), params: vec![ty!("Bulgogi")] })
] }))
);
parse_test_wrap_ast!("a : (Int, Yolo<a>)",
exst!(val!("a"), Tuple(
vec![ty!("Int"), Singleton(TypeSingletonName {
name: rc!(Yolo), params: vec![ty!("a")]
})])));
}
#[test]
fn parsing_lambdas() {
parse_test_wrap_ast! { r#"\(x) { x + 1}"#, exst!(
Lambda { params: vec![FormalParam { name: rc!(x), anno: None, default: None } ], type_anno: None, body: vec![exst!(s "x + 1")] }
)
}
parse_test_wrap_ast!(r#"\ (x: Int, y) { a;b;c;}"#,
exst!(Lambda {
params: vec![
FormalParam { name: rc!(x), anno: Some(ty!("Int")), default: None },
FormalParam { name: rc!(y), anno: None, default: None }
],
type_anno: None,
body: vec![exst!(s "a"), exst!(s "b"), exst!(s "c")]
})
);
parse_test_wrap_ast! { r#"\(x){y}(1)"#,
exst!(Call { f: bx!(ex!(m
Lambda {
params: vec![
FormalParam { name: rc!(x), anno: None, default: None }
],
type_anno: None,
body: vec![exst!(s "y")] }
)),
arguments: vec![inv!(ex!(m NatLiteral(1))).into()] })
};
parse_test_wrap_ast! {
r#"\(x: Int): String { "q" }"#,
exst!(Lambda {
params: vec![
FormalParam { name: rc!(x), anno: Some(ty!("Int")), default: None },
],
type_anno: Some(ty!("String")),
body: vec![exst!(s r#""q""#)]
})
}
}
#[test]
fn single_param_lambda() {
parse_test_wrap_ast! {
r"\x { x + 10 }",
exst!(Lambda {
params: vec![FormalParam { name: rc!(x), anno: None, default: None }],
type_anno: None,
body: vec![exst!(s r"x + 10")]
})
}
parse_test_wrap_ast! {
r"\x: Nat { x + 10 }",
exst!(Lambda {
params: vec![FormalParam { name: rc!(x), anno: Some(ty!("Nat")), default: None }],
type_anno: None,
body: vec![exst!(s r"x + 10")]
})
}
}
#[test]
fn more_advanced_lambdas() {
parse_test! {
r#"fn wahoo() { let a = 10; \(x) { x + a } };
wahoo()(3) "#,
AST {
id: ItemIdStore::new_id(),
statements: vec![
exst!(s r"fn wahoo() { let a = 10; \(x) { x + a } }"),
exst! {
Call {
f: bx!(ex!(m Call { f: bx!(ex!(m val!("wahoo"))), arguments: vec![] })),
arguments: vec![inv!(ex!(m NatLiteral(3))).into()],
}
}
]
}
}
}
#[test]
fn list_literals() {
parse_test_wrap_ast! {
"[1,2]",
exst!(ListLiteral(vec![ex!(m NatLiteral(1)), ex!(m NatLiteral(2))]))
};
}
#[test]
fn while_expr() {
parse_test_wrap_ast! {
"while { }",
exst!(WhileExpression { condition: None, body: vec![] })
}
parse_test_wrap_ast! {
"while a == b { }",
exst!(WhileExpression { condition: Some(bx![ex![m binexp!("==", val!("a"), val!("b"))]]), body: vec![] })
}
}
#[test]
fn for_expr() {
parse_test_wrap_ast! {
"for { a <- maybeValue } return 1",
exst!(ForExpression {
enumerators: vec![Enumerator { id: rc!(a), generator: ex!(val!("maybeValue")) }],
body: bx!(MonadicReturn(ex!(s "1")))
})
}
parse_test_wrap_ast! {
"for n <- someRange { f(n); }",
exst!(ForExpression { enumerators: vec![Enumerator { id: rc!(n), generator: ex!(val!("someRange"))}],
body: bx!(ForBody::StatementBlock(vec![exst!(s "f(n)")]))
})
}
}
#[test]
fn patterns() {
parse_test_wrap_ast! {
"if x is Some(a) then { 4 } else { 9 }", exst!(
IfExpression {
discriminator: bx!(Discriminator::Simple(Meta::new(ex!(s "x")))),
body: bx!(IfExpressionBody::SimplePatternMatch(Pattern::TupleStruct(qname!(Some),
vec![Pattern::VarOrName(qname!(a))]), vec![exst!(s "4")], Some(vec![exst!(s "9")]))) }
)
}
parse_test_wrap_ast! {
"if x is Some(a) then 4 else 9", exst!(
IfExpression {
discriminator: bx!(Discriminator::Simple(Meta::new(ex!(s "x")))),
body: bx!(IfExpressionBody::SimplePatternMatch(Pattern::TupleStruct(qname!(Some),
vec![Pattern::VarOrName(qname!(a))]), vec![exst!(s "4")], Some(vec![exst!(s "9")]))) }
)
}
parse_test_wrap_ast! {
"if x is Something { a, b: x } then { 4 } else { 9 }", exst!(
IfExpression {
discriminator: bx!(Discriminator::Simple(Meta::new(ex!(s "x")))),
body: bx!(IfExpressionBody::SimplePatternMatch(
Pattern::Record(qname!(Something), vec![
(rc!(a),Pattern::Literal(PatternLiteral::StringPattern(rc!(a)))),
(rc!(b),Pattern::VarOrName(qname!(x)))
]),
vec![exst!(s "4")], Some(vec![exst!(s "9")])))
}
)
}
}
#[test]
fn pattern_literals() {
parse_test_wrap_ast! {
"if x is -1 then 1 else 2",
exst!(
IfExpression {
discriminator: bx!(Discriminator::Simple(Meta::new(ex!(s "x")))),
body: bx!(IfExpressionBody::SimplePatternMatch(
Pattern::Literal(PatternLiteral::NumPattern { neg: true, num: NatLiteral(1) }),
vec![exst!(NatLiteral(1))],
Some(vec![exst!(NatLiteral(2))]),
))
}
)
}
parse_test_wrap_ast! {
"if x is 1 then 1 else 2",
exst!(
IfExpression {
discriminator: bx!(Discriminator::Simple(Meta::new(ex!(s "x")))),
body: bx!(IfExpressionBody::SimplePatternMatch(
Pattern::Literal(PatternLiteral::NumPattern { neg: false, num: NatLiteral(1) }),
vec![exst!(s "1")],
Some(vec![exst!(s "2")]),
))
}
)
}
parse_test_wrap_ast! {
"if x is true then 1 else 2",
exst!(
IfExpression {
discriminator: bx!(Discriminator::Simple(Meta::new(ex!(s "x")))),
body: bx!(IfExpressionBody::SimplePatternMatch(
Pattern::Literal(PatternLiteral::BoolPattern(true)),
vec![exst!(NatLiteral(1))],
Some(vec![exst!(NatLiteral(2))]),
))
}
)
}
parse_test_wrap_ast! {
"if x is \"gnosticism\" then 1 else 2",
exst!(
IfExpression {
discriminator: bx!(Discriminator::Simple(Meta::new(ex!(s "x")))),
body: bx!(IfExpressionBody::SimplePatternMatch(
Pattern::Literal(PatternLiteral::StringPattern(rc!(gnosticism))),
vec![exst!(s "1")],
Some(vec![exst!(s "2")]),
))
}
)
}
}

View File

@@ -1,14 +0,0 @@
type Option<T> = Some(T) | None
type Ord = LT | EQ | GT
fn map(input: Option<T>, func: Func): Option<T> {
if input {
is Option::Some(x) -> Option::Some(func(x)),
is Option::None -> Option::None,
}
}
type Complicated = Sunrise | Metal { black: bool, norwegian: bool } | Fella(String, Int)

View File

@@ -1,516 +0,0 @@
//! # Reduced AST
//! The reduced AST is a minimal AST designed to be built from the full AST after all possible
//! static checks have been done. Consequently, the AST reduction phase does very little error
//! checking itself - any errors should ideally be caught either by an earlier phase, or are
//! runtime errors that the evaluator should handle. That said, because it does table lookups
//! that can in principle fail [especially at the moment with most static analysis not yet complete],
//! there is an Expr variant `ReductionError` to handle these cases.
//!
//! A design decision to make - should the ReducedAST types contain all information about
//! type/layout necessary for the evaluator to work? If so, then the evaluator should not
//! have access to the symbol table at all and ReducedAST should carry that information. If not,
//! then ReducedAST shouldn't be duplicating information that can be queried at runtime from the
//! symbol table. But I think the former might make sense since ultimately the bytecode will be
//! built from the ReducedAST.
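//!
//! As an illustrative sketch (hand-written here, not tool output): a source expression like
//! `1 + 2` reduces to roughly
//! `Expr::Call { f: Box::new(Expr::Func(Func::BuiltIn(..))), args: vec![Expr::Lit(Lit::Nat(1)), Expr::Lit(Lit::Nat(2))] }`,
//! where the exact `Builtin` variant is whatever `Builtin::from_str` returns for the `+` sigil.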
use std::rc::Rc;
use std::str::FromStr;
use crate::ast::*;
use crate::symbol_table::{Symbol, SymbolSpec, SymbolTable, FullyQualifiedSymbolName};
use crate::builtin::Builtin;
#[derive(Debug)]
pub struct ReducedAST(pub Vec<Stmt>);
#[derive(Debug, Clone)]
pub enum Stmt {
PreBinding {
name: Rc<String>,
func: Func,
},
Binding {
name: Rc<String>,
constant: bool,
expr: Expr,
},
Expr(Expr),
Noop,
}
#[derive(Debug, Clone)]
pub enum Expr {
Unit,
Lit(Lit),
Tuple(Vec<Expr>),
Func(Func),
Sym(Rc<String>),
Constructor {
type_name: Rc<String>,
name: Rc<String>,
tag: usize,
arity: usize, // n.b. arity here is always the value from the symbol table; if it doesn't match the number of arguments at the call site, that's an eval error, and eval will handle it
},
Call {
f: Box<Expr>,
args: Vec<Expr>,
},
Assign {
val: Box<Expr>,
expr: Box<Expr>,
},
Conditional {
cond: Box<Expr>,
then_clause: Vec<Stmt>,
else_clause: Vec<Stmt>,
},
ConditionalTargetSigilValue,
CaseMatch {
cond: Box<Expr>,
alternatives: Vec<Alternative>
},
UnimplementedSigilValue,
ReductionError(String),
}
pub type BoundVars = Vec<Option<Rc<String>>>; //remember that order matters here
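// Illustrative example: for a tuple pattern like `(1, x)`, the bound vars are conceptually
// `[None, Some("x")]` - one entry per pattern position, in order.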
#[derive(Debug, Clone)]
pub struct Alternative {
pub matchable: Subpattern,
pub item: Vec<Stmt>,
}
#[derive(Debug, Clone)]
pub struct Subpattern {
pub tag: Option<usize>,
pub subpatterns: Vec<Option<Subpattern>>,
pub bound_vars: BoundVars,
pub guard: Option<Expr>,
}
#[derive(Debug, Clone)]
pub enum Lit {
Nat(u64),
Int(i64),
Float(f64),
Bool(bool),
StringLit(Rc<String>),
}
#[derive(Debug, Clone)]
pub enum Func {
BuiltIn(Builtin),
UserDefined {
name: Option<Rc<String>>,
params: Vec<Rc<String>>,
body: Vec<Stmt>,
}
}
pub fn reduce(ast: &AST, symbol_table: &SymbolTable) -> ReducedAST {
let mut reducer = Reducer { symbol_table };
reducer.ast(ast)
}
struct Reducer<'a> {
symbol_table: &'a SymbolTable
}
impl<'a> Reducer<'a> {
fn ast(&mut self, ast: &AST) -> ReducedAST {
let mut output = vec![];
for statement in ast.statements.iter() {
output.push(self.statement(statement));
}
ReducedAST(output)
}
fn statement(&mut self, stmt: &Statement) -> Stmt {
match &stmt.kind {
StatementKind::Expression(expr) => Stmt::Expr(self.expression(&expr)),
StatementKind::Declaration(decl) => self.declaration(&decl),
}
}
fn block(&mut self, block: &Block) -> Vec<Stmt> {
block.iter().map(|stmt| self.statement(stmt)).collect()
}
fn invocation_argument(&mut self, invoc: &InvocationArgument) -> Expr {
use crate::ast::InvocationArgument::*;
match invoc {
Positional(ex) => self.expression(ex),
Keyword { .. } => Expr::UnimplementedSigilValue,
Ignored => Expr::UnimplementedSigilValue,
}
}
fn expression(&mut self, expr: &Expression) -> Expr {
use crate::ast::ExpressionKind::*;
let symbol_table = self.symbol_table;
let ref input = expr.kind;
match input {
NatLiteral(n) => Expr::Lit(Lit::Nat(*n)),
FloatLiteral(f) => Expr::Lit(Lit::Float(*f)),
StringLiteral(s) => Expr::Lit(Lit::StringLit(s.clone())),
BoolLiteral(b) => Expr::Lit(Lit::Bool(*b)),
BinExp(binop, lhs, rhs) => self.binop(binop, lhs, rhs),
PrefixExp(op, arg) => self.prefix(op, arg),
Value(qualified_name) => {
let ref id = qualified_name.id;
let ref sym_name = match symbol_table.get_fqsn_from_id(id) {
Some(fqsn) => fqsn,
None => return Expr::ReductionError(format!("FQSN lookup for Value {:?} failed", qualified_name)),
};
//TODO this probably needs to change
let FullyQualifiedSymbolName(ref v) = sym_name;
let name = v.last().unwrap().name.clone();
match symbol_table.lookup_by_fqsn(&sym_name) {
Some(Symbol { spec: SymbolSpec::DataConstructor { index, type_args, type_name}, .. }) => Expr::Constructor {
type_name: type_name.clone(),
name: name.clone(),
tag: index.clone(),
arity: type_args.len(),
},
_ => Expr::Sym(name.clone()),
}
},
Call { f, arguments } => self.reduce_call_expression(f, arguments),
TupleLiteral(exprs) => Expr::Tuple(exprs.iter().map(|e| self.expression(e)).collect()),
IfExpression { discriminator, body } => self.reduce_if_expression(discriminator, body),
Lambda { params, body, .. } => self.reduce_lambda(params, body),
NamedStruct { name, fields } => self.reduce_named_struct(name, fields),
Index { .. } => Expr::UnimplementedSigilValue,
WhileExpression { .. } => Expr::UnimplementedSigilValue,
ForExpression { .. } => Expr::UnimplementedSigilValue,
ListLiteral { .. } => Expr::UnimplementedSigilValue,
}
}
fn reduce_lambda(&mut self, params: &Vec<FormalParam>, body: &Block) -> Expr {
Expr::Func(Func::UserDefined {
name: None,
params: params.iter().map(|param| param.name.clone()).collect(),
body: self.block(body),
})
}
fn reduce_named_struct(&mut self, name: &QualifiedName, fields: &Vec<(Rc<String>, Expression)>) -> Expr {
let symbol_table = self.symbol_table;
let ref sym_name = match symbol_table.get_fqsn_from_id(&name.id) {
Some(fqsn) => fqsn,
None => return Expr::ReductionError(format!("FQSN lookup for name {:?} failed", name)),
};
let FullyQualifiedSymbolName(ref v) = sym_name;
let ref name = v.last().unwrap().name;
let (type_name, index, members_from_table) = match symbol_table.lookup_by_fqsn(&sym_name) {
Some(Symbol { spec: SymbolSpec::RecordConstructor { members, type_name, index }, .. }) => (type_name.clone(), index, members),
_ => return Expr::ReductionError("Not a record constructor".to_string()),
};
let arity = members_from_table.len();
let mut args: Vec<(Rc<String>, Expr)> = fields.iter()
.map(|(name, expr)| (name.clone(), self.expression(expr)))
.collect();
args.as_mut_slice()
.sort_unstable_by(|(name1, _), (name2, _)| name1.cmp(name2)); //arbitrary - sorting by alphabetical order
let args = args.into_iter().map(|(_, expr)| expr).collect();
//TODO make sure this sorting actually works
let f = box Expr::Constructor { type_name, name: name.clone(), tag: *index, arity, };
Expr::Call { f, args }
}
fn reduce_call_expression(&mut self, func: &Expression, arguments: &Vec<InvocationArgument>) -> Expr {
Expr::Call {
f: Box::new(self.expression(func)),
args: arguments.iter().map(|arg| self.invocation_argument(arg)).collect(),
}
}
fn reduce_if_expression(&mut self, discriminator: &Discriminator, body: &IfExpressionBody) -> Expr {
let symbol_table = self.symbol_table;
let cond = Box::new(match *discriminator {
Discriminator::Simple(ref expr) => self.expression(expr),
Discriminator::BinOp(ref _expr, ref _binop) => panic!("Can't yet handle binop discriminators")
});
match *body {
IfExpressionBody::SimpleConditional(ref then_clause, ref else_clause) => {
let then_clause = self.block(then_clause);
let else_clause = match else_clause {
None => vec![],
Some(stmts) => self.block(stmts),
};
Expr::Conditional { cond, then_clause, else_clause }
},
IfExpressionBody::SimplePatternMatch(ref pat, ref then_clause, ref else_clause) => {
let then_clause = self.block(then_clause);
let else_clause = match else_clause {
None => vec![],
Some(stmts) => self.block(stmts),
};
let alternatives = vec![
pat.to_alternative(then_clause, symbol_table),
Alternative {
matchable: Subpattern {
tag: None,
subpatterns: vec![],
bound_vars: vec![],
guard: None,
},
item: else_clause
},
];
Expr::CaseMatch {
cond,
alternatives,
}
},
IfExpressionBody::GuardList(ref guard_arms) => {
let mut alternatives = vec![];
for arm in guard_arms {
match arm.guard {
Guard::Pat(ref p) => {
let item = self.block(&arm.body);
let alt = p.to_alternative(item, symbol_table);
alternatives.push(alt);
},
Guard::HalfExpr(HalfExpr { op: _, expr: _ }) => {
return Expr::UnimplementedSigilValue
}
}
}
Expr::CaseMatch { cond, alternatives }
}
}
}
fn binop(&mut self, binop: &BinOp, lhs: &Box<Expression>, rhs: &Box<Expression>) -> Expr {
let operation = Builtin::from_str(binop.sigil()).ok();
match operation {
Some(Builtin::Assignment) => Expr::Assign {
val: Box::new(self.expression(&*lhs)),
expr: Box::new(self.expression(&*rhs)),
},
Some(op) => {
let f = Box::new(Expr::Func(Func::BuiltIn(op)));
Expr::Call { f, args: vec![self.expression(&*lhs), self.expression(&*rhs)] }
},
None => {
//TODO handle a user-defined operation
Expr::UnimplementedSigilValue
}
}
}
fn prefix(&mut self, prefix: &PrefixOp, arg: &Box<Expression>) -> Expr {
match prefix.builtin {
Some(op) => {
let f = Box::new(Expr::Func(Func::BuiltIn(op)));
Expr::Call { f, args: vec![self.expression(arg)] }
},
None => { //TODO need this for custom prefix ops
Expr::UnimplementedSigilValue
}
}
}
fn declaration(&mut self, declaration: &Declaration) -> Stmt {
use self::Declaration::*;
match declaration {
Binding {name, constant, expr, .. } => Stmt::Binding { name: name.clone(), constant: *constant, expr: self.expression(expr) },
FuncDecl(Signature { name, params, .. }, statements) => Stmt::PreBinding {
name: name.clone(),
func: Func::UserDefined {
name: Some(name.clone()),
params: params.iter().map(|param| param.name.clone()).collect(),
body: self.block(&statements),
}
},
TypeDecl { .. } => Stmt::Noop,
TypeAlias(_, _) => Stmt::Noop,
Interface { .. } => Stmt::Noop,
Impl { .. } => Stmt::Expr(Expr::UnimplementedSigilValue),
_ => Stmt::Expr(Expr::UnimplementedSigilValue)
}
}
}
/* Inner patterns can be ignored (`_`), bind a variable (`x`), or be a nested pattern (`Some(t)`),
* e.g.: x is SomeBigOldEnum(_, x, Some(t))
*/
fn handle_symbol(symbol: Option<&Symbol>, inner_patterns: &Vec<Pattern>, symbol_table: &SymbolTable) -> Subpattern {
use self::Pattern::*;
let tag = symbol.map(|symbol| match symbol.spec {
SymbolSpec::DataConstructor { index, .. } => index.clone(),
_ => panic!("Symbol is not a data constructor - this should've been caught in type-checking"),
});
let bound_vars = inner_patterns.iter().map(|p| match p {
VarOrName(qualified_name) => {
let fqsn = symbol_table.get_fqsn_from_id(&qualified_name.id);
let symbol_exists = fqsn.and_then(|fqsn| symbol_table.lookup_by_fqsn(&fqsn)).is_some();
if symbol_exists {
None
} else {
let QualifiedName { components, .. } = qualified_name;
if components.len() == 1 {
Some(components[0].clone())
} else {
panic!("Bad variable name in pattern");
}
}
},
_ => None,
}).collect();
let subpatterns = inner_patterns.iter().map(|p| match p {
Ignored => None,
VarOrName(_) => None,
Literal(other) => Some(other.to_subpattern(symbol_table)),
tp @ TuplePattern(_) => Some(tp.to_subpattern(symbol_table)),
ts @ TupleStruct(_, _) => Some(ts.to_subpattern(symbol_table)),
Record(..) => unimplemented!(),
}).collect();
let guard = None;
/*
let guard_equality_exprs: Vec<Expr> = subpatterns.iter().map(|p| match p {
Literal(lit) => match lit {
_ => unimplemented!()
},
_ => unimplemented!()
}).collect();
*/
Subpattern {
tag,
subpatterns,
guard,
bound_vars,
}
}
impl Pattern {
fn to_alternative(&self, item: Vec<Stmt>, symbol_table: &SymbolTable) -> Alternative {
let s = self.to_subpattern(symbol_table);
Alternative {
matchable: Subpattern {
tag: s.tag,
subpatterns: s.subpatterns,
bound_vars: s.bound_vars,
guard: s.guard,
},
item
}
}
fn to_subpattern(&self, symbol_table: &SymbolTable) -> Subpattern {
use self::Pattern::*;
match self {
TupleStruct(QualifiedName{ components, id }, inner_patterns) => {
let fqsn = symbol_table.get_fqsn_from_id(&id);
match fqsn.and_then(|fqsn| symbol_table.lookup_by_fqsn(&fqsn)) {
Some(symbol) => handle_symbol(Some(symbol), inner_patterns, symbol_table),
None => {
panic!("Symbol {:?} not found", components);
}
}
},
TuplePattern(inner_patterns) => handle_symbol(None, inner_patterns, symbol_table),
Record(_name, _pairs) => {
unimplemented!()
},
Ignored => Subpattern { tag: None, subpatterns: vec![], guard: None, bound_vars: vec![] },
Literal(lit) => lit.to_subpattern(symbol_table),
VarOrName(QualifiedName { components, id }) => {
// if fqsn is Some, treat this as a symbol pattern. If it's None, treat it
// as a variable.
println!("Calling VarOrName reduction with : {:?}", components);
let fqsn = symbol_table.get_fqsn_from_id(&id);
match fqsn.and_then(|fqsn| symbol_table.lookup_by_fqsn(&fqsn)) {
Some(symbol) => handle_symbol(Some(symbol), &vec![], symbol_table),
None => {
let name = if components.len() == 1 {
components[0].clone()
} else {
panic!("check this line of code yo");
};
Subpattern {
tag: None,
subpatterns: vec![],
guard: None,
bound_vars: vec![Some(name.clone())],
}
}
}
},
}
}
}
impl PatternLiteral {
fn to_subpattern(&self, _symbol_table: &SymbolTable) -> Subpattern {
use self::PatternLiteral::*;
match self {
NumPattern { neg, num } => {
let comparison = Expr::Lit(match (neg, num) {
(false, ExpressionKind::NatLiteral(n)) => Lit::Nat(*n),
(false, ExpressionKind::FloatLiteral(f)) => Lit::Float(*f),
(true, ExpressionKind::NatLiteral(n)) => Lit::Int(-1*(*n as i64)),
(true, ExpressionKind::FloatLiteral(f)) => Lit::Float(-1.0*f),
_ => panic!("This should never happen")
});
let guard = Some(Expr::Call {
f: Box::new(Expr::Func(Func::BuiltIn(Builtin::Equality))),
args: vec![comparison, Expr::ConditionalTargetSigilValue],
});
Subpattern {
tag: None,
subpatterns: vec![],
guard,
bound_vars: vec![],
}
},
StringPattern(s) => {
let guard = Some(Expr::Call {
f: Box::new(Expr::Func(Func::BuiltIn(Builtin::Equality))),
args: vec![Expr::Lit(Lit::StringLit(s.clone())), Expr::ConditionalTargetSigilValue]
});
Subpattern {
tag: None,
subpatterns: vec![],
guard,
bound_vars: vec![],
}
},
BoolPattern(b) => {
let guard = Some(if *b {
Expr::ConditionalTargetSigilValue
} else {
Expr::Call {
f: Box::new(Expr::Func(Func::BuiltIn(Builtin::BooleanNot))),
args: vec![Expr::ConditionalTargetSigilValue]
}
});
Subpattern {
tag: None,
subpatterns: vec![],
guard,
bound_vars: vec![],
}
},
}
}
}

View File

@@ -1,322 +0,0 @@
use stopwatch::Stopwatch;
use std::time::Duration;
use std::cell::RefCell;
use std::rc::Rc;
use std::collections::HashSet;
use itertools::Itertools;
use schala_repl::{ProgrammingLanguageInterface,
ComputationRequest, ComputationResponse,
LangMetaRequest, LangMetaResponse, GlobalOutputStats,
DebugResponse, DebugAsk};
use crate::{ast, reduced_ast, tokenizing, parsing, eval, typechecking, symbol_table};
/// All the state necessary to parse and execute a Schala program is stored in this struct.
/// `state` holds the execution state for the AST-walking interpreter; `symbol_table` and
/// `type_context` carry symbol and type information shared between passes; `source_reference`
/// keeps the source text for error reporting; `active_parser` optionally holds a parser for
/// the parsing pass to reuse.
pub struct Schala {
source_reference: SourceReference,
state: eval::State<'static>,
symbol_table: Rc<RefCell<symbol_table::SymbolTable>>,
type_context: typechecking::TypeContext<'static>,
active_parser: Option<parsing::Parser>,
}
impl Schala {
fn handle_docs(&self, source: String) -> LangMetaResponse {
LangMetaResponse::Docs {
doc_string: format!("Schala item `{}` : <<Schala-lang documentation not yet implemented>>", source)
}
}
}
impl Schala {
/// Creates a new Schala environment *without* any prelude.
fn new_blank_env() -> Schala {
let symbols = Rc::new(RefCell::new(symbol_table::SymbolTable::new()));
Schala {
source_reference: SourceReference::new(),
symbol_table: symbols.clone(),
state: eval::State::new(symbols),
type_context: typechecking::TypeContext::new(),
active_parser: None,
}
}
/// Creates a new Schala environment with the standard prelude, which is defined as ordinary
/// Schala code in the file `prelude.schala`
pub fn new() -> Schala {
let prelude = include_str!("prelude.schala");
let mut s = Schala::new_blank_env();
let request = ComputationRequest { source: prelude, debug_requests: HashSet::default() };
s.run_computation(request);
s
}
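// Illustrative usage sketch (hypothetical client code, not part of this file): a front-end
// can drive the full pass chain with the same types `new()` uses above:
//
//     let mut schala = Schala::new();
//     let request = ComputationRequest { source: "1 + 2", debug_requests: HashSet::default() };
//     let response = schala.run_computation(request);
//     // `response.main_output` is a Result<String, String> holding the printed value or an error.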
fn handle_debug_immediate(&self, request: DebugAsk) -> DebugResponse {
use DebugAsk::*;
match request {
Timing => DebugResponse { ask: Timing, value: format!("Invalid") },
ByStage { stage_name, token } => match &stage_name[..] {
"symbol-table" => {
let value = self.symbol_table.borrow().debug_symbol_table();
DebugResponse {
ask: ByStage { stage_name: format!("symbol-table"), token },
value
}
},
s => {
DebugResponse {
ask: ByStage { stage_name: s.to_string(), token: None },
value: format!("Not-implemented")
}
}
}
}
}
}
fn tokenizing(input: &str, _handle: &mut Schala, comp: Option<&mut PassDebugArtifact>) -> Result<Vec<tokenizing::Token>, String> {
let tokens = tokenizing::tokenize(input);
comp.map(|comp| {
let token_string = tokens.iter().map(|t| t.to_string_with_metadata()).join(", ");
comp.add_artifact(token_string);
});
let errors: Vec<String> = tokens.iter().filter_map(|t| t.get_error()).collect();
if errors.len() == 0 {
Ok(tokens)
} else {
Err(format!("{:?}", errors))
}
}
fn parsing(input: Vec<tokenizing::Token>, handle: &mut Schala, comp: Option<&mut PassDebugArtifact>) -> Result<ast::AST, String> {
use crate::parsing::Parser;
use ParsingDebugType::*;
let mut parser = handle.active_parser.take().unwrap_or_else(|| Parser::new(input));
let ast = parser.parse();
comp.map(|comp| {
let debug_format = comp.parsing.as_ref().unwrap_or(&CompactAST);
let debug_info = match debug_format {
CompactAST => match ast {
Ok(ref ast) => ast.compact_debug(),
Err(_) => "Error - see output".to_string(),
},
ExpandedAST => match ast {
Ok(ref ast) => ast.expanded_debug(),
Err(_) => "Error - see output".to_string(),
},
Trace => parser.format_parse_trace(),
};
comp.add_artifact(debug_info);
});
ast.map_err(|err| format_parse_error(err, handle))
}
fn format_parse_error(error: parsing::ParseError, handle: &mut Schala) -> String {
let line_num = error.token.line_num;
let ch = error.token.char_num;
let line_from_program = handle.source_reference.get_line(line_num);
let location_pointer = format!("{}^", " ".repeat(ch));
let line_num_digits = format!("{}", line_num).chars().count();
let space_padding = " ".repeat(line_num_digits);
format!(r#"
{error_msg}
{space_padding} |
{line_num} | {}
{space_padding} | {}
"#, line_from_program, location_pointer, error_msg=error.msg, space_padding=space_padding, line_num=line_num)
}
fn symbol_table(input: ast::AST, handle: &mut Schala, comp: Option<&mut PassDebugArtifact>) -> Result<ast::AST, String> {
let () = handle.symbol_table.borrow_mut().add_top_level_symbols(&input)?;
comp.map(|comp| {
let debug = handle.symbol_table.borrow().debug_symbol_table();
comp.add_artifact(debug);
});
Ok(input)
}
fn scope_resolution(mut input: ast::AST, handle: &mut Schala, _com: Option<&mut PassDebugArtifact>) -> Result<ast::AST, String> {
let mut symbol_table = handle.symbol_table.borrow_mut();
let mut resolver = crate::scope_resolution::ScopeResolver::new(&mut symbol_table);
let () = resolver.resolve(&mut input)?;
Ok(input)
}
fn typechecking(input: ast::AST, handle: &mut Schala, comp: Option<&mut PassDebugArtifact>) -> Result<ast::AST, String> {
let result = handle.type_context.typecheck(&input);
comp.map(|comp| {
comp.add_artifact(match result {
Ok(ty) => ty.to_string(),
Err(err) => format!("Type error: {}", err.msg)
});
});
Ok(input)
}
fn ast_reducing(input: ast::AST, handle: &mut Schala, comp: Option<&mut PassDebugArtifact>) -> Result<reduced_ast::ReducedAST, String> {
let ref symbol_table = handle.symbol_table.borrow();
let output = reduced_ast::reduce(&input, symbol_table);
comp.map(|comp| comp.add_artifact(format!("{:?}", output)));
Ok(output)
}
fn eval(input: reduced_ast::ReducedAST, handle: &mut Schala, comp: Option<&mut PassDebugArtifact>) -> Result<String, String> {
comp.map(|comp| comp.add_artifact(handle.state.debug_print()));
let evaluation_outputs = handle.state.evaluate(input, true);
let text_output: Result<Vec<String>, String> = evaluation_outputs
.into_iter()
.collect();
let eval_output: Result<String, String> = text_output
.map(|v| { v.into_iter().intersperse(format!("\n")).collect() });
eval_output
}
/// Represents lines of source code
struct SourceReference {
lines: Option<Vec<String>>
}
impl SourceReference {
fn new() -> SourceReference {
SourceReference { lines: None }
}
fn load_new_source(&mut self, source: &str) {
//TODO this is a lot of heap allocations - maybe there's a way to make it more efficient?
self.lines = Some(source.lines().map(|s| s.to_string()).collect());
}
fn get_line(&self, line: usize) -> String {
self.lines.as_ref().and_then(|x| x.get(line).map(|s| s.to_string())).unwrap_or(format!("NO LINE FOUND"))
}
}
enum ParsingDebugType {
CompactAST,
ExpandedAST,
Trace
}
#[derive(Default)]
struct PassDebugArtifact {
parsing: Option<ParsingDebugType>,
artifacts: Vec<String>
}
impl PassDebugArtifact {
fn add_artifact(&mut self, artifact: String) {
self.artifacts.push(artifact)
}
}
fn stage_names() -> Vec<&'static str> {
vec![
"tokenizing",
"parsing",
"symbol-table",
"scope-resolution",
"typechecking",
"ast-reduction",
"ast-walking-evaluation"
]
}
impl ProgrammingLanguageInterface for Schala {
fn get_language_name(&self) -> String { format!("Schala") }
fn get_source_file_suffix(&self) -> String { format!("schala") }
fn run_computation(&mut self, request: ComputationRequest) -> ComputationResponse {
struct PassToken<'a> {
schala: &'a mut Schala,
stage_durations: &'a mut Vec<(String, Duration)>,
sw: &'a Stopwatch,
debug_requests: &'a HashSet<DebugAsk>,
debug_responses: &'a mut Vec<DebugResponse>,
}
fn output_wrapper<Input, Output, F>(n: usize, func: F, input: Input, token: &mut PassToken) -> Result<Output, String>
where F: Fn(Input, &mut Schala, Option<&mut PassDebugArtifact>) -> Result<Output, String>
{
let stage_names = stage_names();
let cur_stage_name = stage_names[n];
let ask = token.debug_requests.iter().find(|ask| ask.is_for_stage(cur_stage_name));
let parsing = match ask {
Some(DebugAsk::ByStage { token, .. }) if cur_stage_name == "parsing" => Some(
token.as_ref().map(|token| match &token[..] {
"compact" => ParsingDebugType::CompactAST,
"expanded" => ParsingDebugType::ExpandedAST,
"trace" => ParsingDebugType::Trace,
_ => ParsingDebugType::CompactAST,
}).unwrap_or(ParsingDebugType::CompactAST)
),
_ => None,
};
let mut debug_artifact = ask.map(|_| PassDebugArtifact {
parsing, ..Default::default()
});
let output = func(input, token.schala, debug_artifact.as_mut());
//TODO I think this is not counting the time since the *previous* stage
token.stage_durations.push((cur_stage_name.to_string(), token.sw.elapsed()));
if let Some(artifact) = debug_artifact {
for value in artifact.artifacts.into_iter() {
let resp = DebugResponse { ask: ask.unwrap().clone(), value };
token.debug_responses.push(resp);
}
}
output
}
let ComputationRequest { source, debug_requests } = request;
self.source_reference.load_new_source(source);
let sw = Stopwatch::start_new();
let mut stage_durations = Vec::new();
let mut debug_responses = Vec::new();
let mut tok = PassToken { schala: self, stage_durations: &mut stage_durations, sw: &sw, debug_requests: &debug_requests, debug_responses: &mut debug_responses };
let main_output: Result<String, String> = Ok(source)
.and_then(|source| output_wrapper(0, tokenizing, source, &mut tok))
.and_then(|tokens| output_wrapper(1, parsing, tokens, &mut tok))
.and_then(|ast| output_wrapper(2, symbol_table, ast, &mut tok))
.and_then(|ast| output_wrapper(3, scope_resolution, ast, &mut tok))
.and_then(|ast| output_wrapper(4, typechecking, ast, &mut tok))
.and_then(|ast| output_wrapper(5, ast_reducing, ast, &mut tok))
.and_then(|reduced_ast| output_wrapper(6, eval, reduced_ast, &mut tok));
let total_duration = sw.elapsed();
let global_output_stats = GlobalOutputStats {
total_duration, stage_durations
};
ComputationResponse {
main_output,
global_output_stats,
debug_responses,
}
}
fn request_meta(&mut self, request: LangMetaRequest) -> LangMetaResponse {
match request {
LangMetaRequest::StageNames => LangMetaResponse::StageNames(stage_names().iter().map(|s| s.to_string()).collect()),
LangMetaRequest::Docs { source } => self.handle_docs(source),
LangMetaRequest::ImmediateDebug(debug_request) =>
LangMetaResponse::ImmediateDebug(self.handle_debug_immediate(debug_request)),
LangMetaRequest::Custom { .. } => LangMetaResponse::Custom { kind: format!("not-implemented"), value: format!("") }
}
}
}

View File

@@ -1,177 +0,0 @@
use crate::symbol_table::{SymbolTable, ScopeSegment, ScopeSegmentKind, FullyQualifiedSymbolName};
use crate::ast::*;
pub struct ScopeResolver<'a> {
symbol_table: &'a mut SymbolTable
}
impl<'a> ScopeResolver<'a> {
pub fn new(symbol_table: &'a mut SymbolTable) -> ScopeResolver {
ScopeResolver { symbol_table }
}
pub fn resolve(&mut self, ast: &mut AST) -> Result<(), String> {
for statement in ast.statements.iter() {
match statement.kind {
StatementKind::Declaration(ref decl) => self.decl(decl),
StatementKind::Expression(ref expr) => self.expr(expr),
}?;
}
Ok(())
}
fn decl(&mut self, decl: &Declaration) -> Result<(), String> {
use Declaration::*;
match decl {
Binding { expr, .. } => self.expr(expr),
FuncDecl(_, block) => self.block(block),
_ => Ok(()),
}
}
fn block(&mut self, block: &Block) -> Result<(), String> {
for statement in block.iter() {
match statement.kind {
StatementKind::Declaration(ref decl) => self.decl(decl),
StatementKind::Expression(ref expr) => self.expr(expr),
}?;
}
Ok(())
}
fn expr(&mut self, expr: &Expression) -> Result<(), String> {
use ExpressionKind::*;
match &expr.kind {
ExpressionKind::Value(qualified_name) => {
let fqsn = lookup_name_in_scope(&qualified_name);
let ref id = qualified_name.id;
self.symbol_table.map_id_to_fqsn(id, fqsn);
},
NamedStruct { name, .. } => {
let ref id = name.id;
let fqsn = lookup_name_in_scope(&name);
self.symbol_table.map_id_to_fqsn(id, fqsn);
},
BinExp(_, ref lhs, ref rhs) => {
self.expr(lhs)?;
self.expr(rhs)?;
},
PrefixExp(_, ref arg) => {
self.expr(arg)?;
},
TupleLiteral(exprs) => {
for expr in exprs.iter() {
self.expr(expr)?;
}
},
Call { f, arguments } => {
self.expr(&f)?;
for arg in arguments.iter() {
self.invoc(arg)?;
}
},
Lambda { params, body, .. } => {
self.block(&body)?;
for param in params.iter() {
if let Some(ref expr) = param.default {
self.expr(expr)?;
}
}
},
IfExpression { ref body, ref discriminator } => {
match &**discriminator {
Discriminator::Simple(expr) | Discriminator::BinOp(expr, _) => self.expr(expr)?
};
match &**body {
IfExpressionBody::SimplePatternMatch(ref pat, ref alt1, ref alt2) => {
self.pattern(pat)?;
self.block(alt1)?;
if let Some(alt) = alt2 {
self.block(alt)?;
}
},
IfExpressionBody::GuardList(guardarms) => {
for arm in guardarms.iter() {
if let Guard::Pat(ref pat) = arm.guard {
self.pattern(pat)?;
}
self.block(&arm.body)?;
}
}
_ => ()
}
},
_ => ()
};
Ok(())
}
fn invoc(&mut self, invoc: &InvocationArgument) -> Result<(), String> {
use InvocationArgument::*;
match invoc {
Positional(expr) => self.expr(expr),
Keyword { expr, .. } => self.expr(expr),
_ => Ok(())
}
}
fn pattern(&mut self, pat: &Pattern) -> Result<(), String> {
use Pattern::*;
match pat {
Ignored => (),
TuplePattern(patterns) => {
for pat in patterns {
self.pattern(pat)?;
}
},
Literal(_) => (),
TupleStruct(name, patterns) => {
self.qualified_name_in_pattern(name);
for pat in patterns {
self.pattern(pat)?;
}
},
Record(name, key_patterns) => {
self.qualified_name_in_pattern(name);
for (_, pat) in key_patterns {
self.pattern(pat)?;
}
},
VarOrName(name) => {
self.qualified_name_in_pattern(name);
},
};
Ok(())
}
/// A qualified name in a pattern might refer to an existing symbol (e.g. `Some` in `is Some(x)`)
/// or introduce a new variable (like `x`). Only map the id to an FQSN when the symbol actually
/// exists; otherwise leave it unmapped so it gets treated as a variable.
fn qualified_name_in_pattern(&mut self, qualified_name: &QualifiedName) {
let ref id = qualified_name.id;
let fqsn = lookup_name_in_scope(qualified_name);
if self.symbol_table.lookup_by_fqsn(&fqsn).is_some() {
self.symbol_table.map_id_to_fqsn(&id, fqsn);
}
}
}
//TODO this is incomplete
fn lookup_name_in_scope(sym_name: &QualifiedName) -> FullyQualifiedSymbolName {
let QualifiedName { components: vec, .. } = sym_name;
let len = vec.len();
let new_vec: Vec<ScopeSegment> = vec.iter().enumerate().map(|(i, name)| {
let kind = if i == (len - 1) {
ScopeSegmentKind::Terminal
} else {
ScopeSegmentKind::Type
};
ScopeSegment { name: name.clone(), kind }
}).collect();
FullyQualifiedSymbolName(new_vec)
}
#[cfg(test)]
mod tests {
#[test]
fn basic_scope() {
}
}

View File

@@ -1,464 +0,0 @@
use std::collections::HashMap;
use std::collections::hash_map::Entry;
use std::rc::Rc;
use std::fmt;
use std::fmt::Write;
use crate::ast;
use crate::ast::{ItemId, TypeBody, TypeSingletonName, Signature, Statement, StatementKind};
use crate::typechecking::TypeName;
type LineNumber = u32;
type SymbolTrackTable = HashMap<Rc<String>, LineNumber>;
#[derive(PartialEq, Eq, Hash, Debug, Clone)]
pub struct FullyQualifiedSymbolName(pub Vec<ScopeSegment>);
impl fmt::Display for FullyQualifiedSymbolName {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
let FullyQualifiedSymbolName(v) = self;
for segment in v {
write!(f, "::{}", segment)?;
}
Ok(())
}
}
#[derive(Debug, Clone, Eq, PartialEq, Hash)]
pub struct ScopeSegment {
pub name: Rc<String>, //TODO maybe this could be a &str, for efficiency?
pub kind: ScopeSegmentKind,
}
impl fmt::Display for ScopeSegment {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
use ScopeSegmentKind::*;
let kind = match self.kind {
Function => "fn",
Type => "ty",
Terminal => "tr",
};
write!(f, "{}({})", self.name, kind)
}
}
impl ScopeSegment {
pub fn new(name: Rc<String>, kind: ScopeSegmentKind) -> ScopeSegment {
ScopeSegment { name, kind }
}
}
#[derive(Debug, Clone, Eq, PartialEq, Hash)]
pub enum ScopeSegmentKind {
Function,
Type,
Terminal,
}
#[allow(unused_macros)]
macro_rules! fqsn {
( $( $name:expr ; $kind:tt),* ) => {
{
let mut vec = vec![];
$(
vec.push(ScopeSegment::new(
Rc::new($name.to_string()),
sym_path_kind!($kind),
));
)*
FullyQualifiedSymbolName(vec)
}
};
}
#[allow(unused_macros)]
macro_rules! sym_path_kind {
(fn) => { ScopeSegmentKind::Function };
(ty) => { ScopeSegmentKind::Type };
(tr) => { ScopeSegmentKind::Terminal };
}
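// Illustrative usage: `fqsn!("Option" ; ty, "Some" ; tr)` builds a FullyQualifiedSymbolName
// with a Type segment followed by a Terminal segment, which the Display impl above renders
// as `::Option(ty)::Some(tr)`.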
//cf. p. 150 or so of Language Implementation Patterns
pub struct SymbolTable {
symbol_path_to_symbol: HashMap<FullyQualifiedSymbolName, Symbol>,
id_to_fqsn: HashMap<ItemId, FullyQualifiedSymbolName>,
}
//TODO add various types of lookups here, maybe multiple hash tables internally?
impl SymbolTable {
pub fn new() -> SymbolTable {
SymbolTable {
symbol_path_to_symbol: HashMap::new(),
id_to_fqsn: HashMap::new(),
}
}
pub fn map_id_to_fqsn(&mut self, id: &ItemId, fqsn: FullyQualifiedSymbolName) {
self.id_to_fqsn.insert(id.clone(), fqsn);
}
pub fn get_fqsn_from_id(&self, id: &ItemId) -> Option<FullyQualifiedSymbolName> {
self.id_to_fqsn.get(&id).cloned()
}
fn add_new_symbol(&mut self, name: &Rc<String>, scope_path: &Vec<ScopeSegment>, spec: SymbolSpec) {
let mut vec: Vec<ScopeSegment> = scope_path.clone();
vec.push(ScopeSegment { name: name.clone(), kind: ScopeSegmentKind::Terminal });
let fully_qualified_name = FullyQualifiedSymbolName(vec);
let symbol = Symbol { name: name.clone(), fully_qualified_name: fully_qualified_name.clone(), spec };
self.symbol_path_to_symbol.insert(fully_qualified_name, symbol);
}
pub fn lookup_by_fqsn(&self, fully_qualified_path: &FullyQualifiedSymbolName) -> Option<&Symbol> {
self.symbol_path_to_symbol.get(fully_qualified_path)
}
}
#[derive(Debug)]
pub struct Symbol {
pub name: Rc<String>, //TODO does this need to be pub?
fully_qualified_name: FullyQualifiedSymbolName,
pub spec: SymbolSpec,
}
impl fmt::Display for Symbol {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
write!(f, "<Name: {}, Spec: {}>", self.name, self.spec)
}
}
#[derive(Debug)]
pub enum SymbolSpec {
Func(Vec<TypeName>),
DataConstructor {
index: usize,
type_name: TypeName,
type_args: Vec<Rc<String>>,
},
RecordConstructor {
index: usize,
members: HashMap<Rc<String>, TypeName>,
type_name: TypeName,
},
Binding
}
impl fmt::Display for SymbolSpec {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
use self::SymbolSpec::*;
match self {
Func(type_names) => write!(f, "Func({:?})", type_names),
DataConstructor { index, type_name, type_args } => write!(f, "DataConstructor(idx: {})({:?} -> {})", index, type_args, type_name),
RecordConstructor { type_name, index, ..} => write!(f, "RecordConstructor(idx: {})(<members> -> {})", index, type_name),
Binding => write!(f, "Binding"),
}
}
}
impl SymbolTable {
/* note: this adds names for *forward reference* but doesn't actually create any types. solve that problem
* later */
pub fn add_top_level_symbols(&mut self, ast: &ast::AST) -> Result<(), String> {
let mut scope_name_stack = Vec::new();
self.add_symbols_from_scope(&ast.statements, &mut scope_name_stack)
}
fn add_symbols_from_scope<'a>(&'a mut self, statements: &Vec<Statement>, scope_name_stack: &mut Vec<ScopeSegment>) -> Result<(), String> {
use self::ast::Declaration::*;
fn insert_and_check_duplicate_symbol(table: &mut SymbolTrackTable, name: &Rc<String>) -> Result<(), String> {
match table.entry(name.clone()) {
Entry::Occupied(o) => {
let line_number = o.get(); //TODO make this actually work
Err(format!("Duplicate definition: {}. It's already defined at {}", name, line_number))
},
Entry::Vacant(v) => {
let line_number = 0; //TODO should work
v.insert(line_number);
Ok(())
}
}
}
let mut seen_identifiers: SymbolTrackTable = HashMap::new();
for statement in statements.iter() {
if let Statement { kind: StatementKind::Declaration(decl), .. } = statement {
match decl {
FuncSig(ref signature) => {
insert_and_check_duplicate_symbol(&mut seen_identifiers, &signature.name)?;
self.add_function_signature(signature, scope_name_stack)?
}
FuncDecl(ref signature, ref body) => {
insert_and_check_duplicate_symbol(&mut seen_identifiers, &signature.name)?;
self.add_function_signature(signature, scope_name_stack)?;
scope_name_stack.push(ScopeSegment{
name: signature.name.clone(),
kind: ScopeSegmentKind::Function,
});
let output = self.add_symbols_from_scope(body, scope_name_stack);
let _ = scope_name_stack.pop();
output?
},
TypeDecl { name, body, mutable } => {
insert_and_check_duplicate_symbol(&mut seen_identifiers, &name.name)?;
//TODO add ScopeSegmentKind::Type here
self.add_type_decl(name, body, mutable, scope_name_stack)?
},
Binding { name, .. } => {
insert_and_check_duplicate_symbol(&mut seen_identifiers, name)?;
self.add_new_symbol(name, scope_name_stack, SymbolSpec::Binding);
}
_ => ()
}
}
}
Ok(())
}
pub fn debug_symbol_table(&self) -> String {
let mut output = format!("Symbol table\n");
for (name, sym) in &self.symbol_path_to_symbol {
write!(output, "{} -> {}\n", name, sym).unwrap();
}
output
}
fn add_function_signature(&mut self, signature: &Signature, scope_name_stack: &mut Vec<ScopeSegment>) -> Result<(), String> {
let mut local_type_context = LocalTypeContext::new();
let types = signature.params.iter().map(|param| match param.anno {
Some(ref type_identifier) => Rc::new(format!("{:?}", type_identifier)),
None => local_type_context.new_universal_type()
}).collect();
self.add_new_symbol(&signature.name, scope_name_stack, SymbolSpec::Func(types));
Ok(())
}
//TODO handle type mutability
fn add_type_decl(&mut self, type_name: &TypeSingletonName, body: &TypeBody, _mutable: &bool, scope_name_stack: &mut Vec<ScopeSegment>) -> Result<(), String> {
use crate::ast::{TypeIdentifier, Variant};
let TypeBody(variants) = body;
let ref type_name = type_name.name;
scope_name_stack.push(ScopeSegment{
name: type_name.clone(),
kind: ScopeSegmentKind::Type,
});
//TODO figure out why _params isn't being used here
for (index, var) in variants.iter().enumerate() {
match var {
Variant::UnitStruct(variant_name) => {
let spec = SymbolSpec::DataConstructor {
index,
type_name: type_name.clone(),
type_args: vec![],
};
self.add_new_symbol(variant_name, scope_name_stack, spec);
},
Variant::TupleStruct(variant_name, tuple_members) => {
//TODO fix the notion of a tuple type
let type_args = tuple_members.iter().map(|type_name| match type_name {
TypeIdentifier::Singleton(TypeSingletonName { name, ..}) => name.clone(),
TypeIdentifier::Tuple(_) => unimplemented!(),
}).collect();
let spec = SymbolSpec::DataConstructor {
index,
type_name: type_name.clone(),
type_args
};
self.add_new_symbol(variant_name, scope_name_stack, spec);
},
Variant::Record { name, members: defined_members } => {
let mut members = HashMap::new();
let mut duplicate_member_definitions = Vec::new();
for (member_name, member_type) in defined_members {
match members.entry(member_name.clone()) {
Entry::Occupied(_) => duplicate_member_definitions.push(member_name.clone()),
Entry::Vacant(v) => {
v.insert(match member_type {
TypeIdentifier::Singleton(TypeSingletonName { name, ..}) => name.clone(),
TypeIdentifier::Tuple(_) => unimplemented!(),
});
}
}
}
if duplicate_member_definitions.len() != 0 {
return Err(format!("Duplicate member(s) in definition of type {}: {:?}", type_name, duplicate_member_definitions));
}
let spec = SymbolSpec::RecordConstructor { index, type_name: type_name.clone(), members };
self.add_new_symbol(name, scope_name_stack, spec);
},
}
}
scope_name_stack.pop();
Ok(())
}
}
struct LocalTypeContext {
state: u8
}
impl LocalTypeContext {
fn new() -> LocalTypeContext {
LocalTypeContext { state: 0 }
}
fn new_universal_type(&mut self) -> TypeName {
let n = self.state;
self.state += 1;
Rc::new(format!("{}", (('a' as u8) + n) as char))
}
}
#[cfg(test)]
mod symbol_table_tests {
use super::*;
use crate::util::quick_ast;
macro_rules! values_in_table {
//TODO multiple values
($source:expr, $single_value:expr) => {
{
let mut symbol_table = SymbolTable::new();
let ast = quick_ast($source);
symbol_table.add_top_level_symbols(&ast).unwrap();
match symbol_table.lookup_by_fqsn($single_value) {
Some(_spec) => (),
None => panic!(),
};
}
}
}
#[test]
fn basic_symbol_table() {
values_in_table! { "let a = 10; fn b() { 20 }", &fqsn!("b"; tr) }
}
#[test]
fn no_duplicates() {
let source = r#"
fn a() { 1 }
fn b() { 2 }
fn a() { 3 }
"#;
let mut symbol_table = SymbolTable::new();
let ast = quick_ast(source);
let output = symbol_table.add_top_level_symbols(&ast).unwrap_err();
assert!(output.contains("Duplicate"))
}
#[test]
fn no_duplicates_2() {
let source = r#"
let a = 20;
let q = 39;
let a = 30;
"#;
let mut symbol_table = SymbolTable::new();
let ast = quick_ast(source);
let output = symbol_table.add_top_level_symbols(&ast).unwrap_err();
assert!(output.contains("Duplicate"))
}
#[test]
fn no_duplicates_3() {
let source = r#"
fn a() {
let a = 20
let b = 40
a + b
}
fn q() {
let x = 30
let x = 33
}
"#;
let mut symbol_table = SymbolTable::new();
let ast = quick_ast(source);
let output = symbol_table.add_top_level_symbols(&ast).unwrap_err();
assert!(output.contains("Duplicate"))
}
#[test]
fn dont_falsely_detect_duplicates() {
let source = r#"
let a = 20;
fn some_func() {
let a = 40;
77
}
let q = 39;
"#;
let mut symbol_table = SymbolTable::new();
let ast = quick_ast(source);
symbol_table.add_top_level_symbols(&ast).unwrap();
assert!(symbol_table.lookup_by_fqsn(&fqsn!["a"; tr]).is_some());
assert!(symbol_table.lookup_by_fqsn(&fqsn!["some_func"; fn, "a";tr]).is_some());
}
#[test]
fn enclosing_scopes() {
let source = r#"
fn outer_func(x) {
fn inner_func(arg) {
arg
}
x + inner_func(x)
}"#;
let mut symbol_table = SymbolTable::new();
let ast = quick_ast(source);
symbol_table.add_top_level_symbols(&ast).unwrap();
assert!(symbol_table.lookup_by_fqsn(&fqsn!("outer_func"; tr)).is_some());
assert!(symbol_table.lookup_by_fqsn(&fqsn!("outer_func"; fn, "inner_func"; tr)).is_some());
}
#[test]
fn enclosing_scopes_2() {
let source = r#"
fn outer_func(x) {
fn inner_func(arg) {
arg
}
fn second_inner_func() {
fn another_inner_func() {
}
}
inner_func(x)
}"#;
let mut symbol_table = SymbolTable::new();
let ast = quick_ast(source);
symbol_table.add_top_level_symbols(&ast).unwrap();
assert!(symbol_table.lookup_by_fqsn(&fqsn!("outer_func"; tr)).is_some());
assert!(symbol_table.lookup_by_fqsn(&fqsn!("outer_func"; fn, "inner_func"; tr)).is_some());
assert!(symbol_table.lookup_by_fqsn(&fqsn!("outer_func"; fn, "second_inner_func"; tr)).is_some());
assert!(symbol_table.lookup_by_fqsn(&fqsn!("outer_func"; fn, "second_inner_func"; fn, "another_inner_func"; tr)).is_some());
}
#[test]
fn enclosing_scopes_3() {
let source = r#"
fn outer_func(x) {
fn inner_func(arg) {
arg
}
fn second_inner_func() {
fn another_inner_func() {
}
fn another_inner_func() {
}
}
inner_func(x)
}"#;
let mut symbol_table = SymbolTable::new();
let ast = quick_ast(source);
let output = symbol_table.add_top_level_symbols(&ast).unwrap_err();
assert!(output.contains("Duplicate"))
}
}

View File

@@ -1,484 +0,0 @@
use std::rc::Rc;
use std::fmt::Write;
use ena::unify::{UnifyKey, InPlaceUnificationTable, UnificationTable, EqUnifyValue};
use crate::ast::*;
use crate::util::ScopeStack;
#[derive(Debug, Clone, PartialEq)]
pub struct TypeData {
ty: Option<Type>
}
impl TypeData {
pub fn new() -> TypeData {
TypeData { ty: None }
}
}
pub type TypeName = Rc<String>;
pub struct TypeContext<'a> {
variable_map: ScopeStack<'a, Rc<String>, Type>,
unification_table: InPlaceUnificationTable<TypeVar>,
}
/// `InferResult` is the monad in which type inference takes place.
type InferResult<T> = Result<T, TypeError>;
#[derive(Debug, Clone)]
pub struct TypeError { pub msg: String }
impl TypeError {
fn new<A, T>(msg: T) -> InferResult<A> where T: Into<String> {
Err(TypeError { msg: msg.into() })
}
}
#[allow(dead_code)] // avoids warning from Compound
#[derive(Debug, Clone, PartialEq)]
pub enum Type {
Const(TypeConst),
Var(TypeVar),
Arrow {
params: Vec<Type>,
ret: Box<Type>
},
Compound {
ty_name: String,
args:Vec<Type>
}
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub struct TypeVar(usize);
impl UnifyKey for TypeVar {
type Value = Option<TypeConst>;
fn index(&self) -> u32 { self.0 as u32 }
fn from_index(u: u32) -> TypeVar { TypeVar(u as usize) }
fn tag() -> &'static str { "TypeVar" }
}
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum TypeConst {
Unit,
Nat,
Int,
Float,
StringT,
Bool,
Ordering,
//UserDefined
}
impl TypeConst {
pub fn to_string(&self) -> String {
use self::TypeConst::*;
match self {
Unit => format!("()"),
Nat => format!("Nat"),
Int => format!("Int"),
Float => format!("Float"),
StringT => format!("String"),
Bool => format!("Bool"),
Ordering => format!("Ordering"),
}
}
}
impl EqUnifyValue for TypeConst { }
macro_rules! ty {
($type_name:ident) => { Type::Const(TypeConst::$type_name) };
($t1:ident -> $t2:ident) => { Type::Arrow { params: vec![ty!($t1)], ret: box ty!($t2) } };
($t1:ident -> $t2:ident -> $t3:ident) => { Type::Arrow { params: vec![ty!($t1), ty!($t2)], ret: box ty!($t3) } };
($type_list:ident, $ret_type:ident) => {
Type::Arrow {
params: $type_list,
ret: box $ret_type,
}
}
}
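// Usage sketch: ty!(Nat) expands to Type::Const(TypeConst::Nat), and ty!(Nat -> Bool)
// expands to Type::Arrow { params: vec![Type::Const(TypeConst::Nat)],
//                          ret: box Type::Const(TypeConst::Bool) },
// i.e. a one-argument function type from Nat to Bool.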
//TODO find a better way to capture the to/from string logic
impl Type {
pub fn to_string(&self) -> String {
use self::Type::*;
match self {
Const(c) => c.to_string(),
Var(v) => format!("t_{}", v.0),
Arrow { params, box ref ret } => {
if params.len() == 0 {
format!("-> {}", ret.to_string())
} else {
let mut buf = String::new();
for p in params.iter() {
write!(buf, "{} -> ", p.to_string()).unwrap();
}
write!(buf, "{}", ret.to_string()).unwrap();
buf
}
},
Compound { .. } => format!("<some compound type>")
}
}
fn from_string(string: &str) -> Option<Type> {
Some(match string {
"()" | "Unit" => ty!(Unit),
"Nat" => ty!(Nat),
"Int" => ty!(Int),
"Float" => ty!(Float),
"String" => ty!(StringT),
"Bool" => ty!(Bool),
"Ordering" => ty!(Ordering),
_ => return None
})
}
}
/*
/// `Type` is parameterized by whether the type variables can be just universal, or universal or
/// existential.
#[derive(Debug, Clone)]
enum Type<A> {
Var(A),
Const(TConst),
Arrow(Box<Type<A>>, Box<Type<A>>),
}
#[derive(Debug, Clone)]
enum TVar {
Univ(UVar),
Exist(ExistentialVar)
}
#[derive(Debug, Clone)]
struct UVar(Rc<String>);
#[derive(Debug, Clone)]
struct ExistentialVar(u32);
impl Type<UVar> {
fn to_tvar(&self) -> Type<TVar> {
match self {
Type::Var(UVar(name)) => Type::Var(TVar::Univ(UVar(name.clone()))),
Type::Const(ref c) => Type::Const(c.clone()),
Type::Arrow(a, b) => Type::Arrow(
Box::new(a.to_tvar()),
Box::new(b.to_tvar())
)
}
}
}
impl Type<TVar> {
fn skolemize(&self) -> Type<UVar> {
match self {
Type::Var(TVar::Univ(uvar)) => Type::Var(uvar.clone()),
Type::Var(TVar::Exist(_)) => Type::Var(UVar(Rc::new(format!("sk")))),
Type::Const(ref c) => Type::Const(c.clone()),
Type::Arrow(a, b) => Type::Arrow(
Box::new(a.skolemize()),
Box::new(b.skolemize())
)
}
}
}
impl TypeIdentifier {
fn to_monotype(&self) -> Type<UVar> {
match self {
TypeIdentifier::Tuple(_) => Type::Const(TConst::Nat),
TypeIdentifier::Singleton(TypeSingletonName { name, .. }) => {
match &name[..] {
"Nat" => Type::Const(TConst::Nat),
"Int" => Type::Const(TConst::Int),
"Float" => Type::Const(TConst::Float),
"Bool" => Type::Const(TConst::Bool),
"String" => Type::Const(TConst::StringT),
_ => Type::Const(TConst::Nat),
}
}
}
}
}
#[derive(Debug, Clone)]
enum TConst {
User(Rc<String>),
Unit,
Nat,
Int,
Float,
StringT,
Bool,
}
impl TConst {
fn user(name: &str) -> TConst {
TConst::User(Rc::new(name.to_string()))
}
}
*/
impl<'a> TypeContext<'a> {
pub fn new() -> TypeContext<'a> {
TypeContext {
variable_map: ScopeStack::new(None),
unification_table: UnificationTable::new(),
}
}
/*
fn new_env(&'a self, new_var: Rc<String>, ty: Type) -> TypeContext<'a> {
let mut new_context = TypeContext {
variable_map: self.variable_map.new_scope(None),
unification_table: UnificationTable::new(), //???? not sure if i want this
};
new_context.variable_map.insert(new_var, ty);
new_context
}
*/
fn get_type_from_name(&self, name: &TypeIdentifier) -> InferResult<Type> {
use self::TypeIdentifier::*;
Ok(match name {
Singleton(TypeSingletonName { name,.. }) => {
match Type::from_string(&name) {
Some(ty) => ty,
None => return TypeError::new(format!("Unknown type name: {}", name))
}
},
Tuple(_) => return TypeError::new("tuples aren't ready yet"),
})
}
/// `typecheck` is the entry point into the type-inference system; it accepts an AST as an argument.
/// Following the example of GHC, the compiler deliberately typechecks before de-sugaring
/// the AST into a ReducedAST.
pub fn typecheck(&mut self, ast: &AST) -> Result<Type, TypeError> {
let mut returned_type = Type::Const(TypeConst::Unit);
for statement in ast.statements.iter() {
returned_type = self.statement(statement)?;
}
Ok(returned_type)
}
fn statement(&mut self, statement: &Statement) -> InferResult<Type> {
match &statement.kind {
StatementKind::Expression(e) => self.expr(e),
StatementKind::Declaration(decl) => self.decl(&decl),
}
}
fn decl(&mut self, decl: &Declaration) -> InferResult<Type> {
use self::Declaration::*;
match decl {
Binding { name, expr, .. } => {
let ty = self.expr(expr)?;
self.variable_map.insert(name.clone(), ty);
},
_ => (),
}
Ok(ty!(Unit))
}
fn invoc(&mut self, invoc: &InvocationArgument) -> InferResult<Type> {
use InvocationArgument::*;
match invoc {
Positional(expr) => self.expr(expr),
_ => Ok(ty!(Nat)) //TODO this is wrong
}
}
fn expr(&mut self, expr: &Expression) -> InferResult<Type> {
match expr {
Expression { kind, type_anno: Some(anno), .. } => {
let t1 = self.expr_type(kind)?;
let t2 = self.get_type_from_name(anno)?;
self.unify(t2, t1)
},
Expression { kind, type_anno: None, .. } => self.expr_type(kind)
}
}
fn expr_type(&mut self, expr: &ExpressionKind) -> InferResult<Type> {
use self::ExpressionKind::*;
Ok(match expr {
NatLiteral(_) => ty!(Nat),
BoolLiteral(_) => ty!(Bool),
FloatLiteral(_) => ty!(Float),
StringLiteral(_) => ty!(StringT),
PrefixExp(op, expr) => self.prefix(op, expr)?,
BinExp(op, lhs, rhs) => self.binexp(op, lhs, rhs)?,
IfExpression { discriminator, body } => self.if_expr(discriminator, body)?,
Value(val) => self.handle_value(val)?,
Call { box ref f, arguments } => self.call(f, arguments)?,
Lambda { params, type_anno, body } => self.lambda(params, type_anno, body)?,
_ => ty!(Unit),
})
}
fn prefix(&mut self, op: &PrefixOp, expr: &Expression) -> InferResult<Type> {
let tf = match op.builtin.map(|b| b.get_type()) {
Some(ty) => ty,
None => return TypeError::new("no type found")
};
let tx = self.expr(expr)?;
self.handle_apply(tf, vec![tx])
}
fn binexp(&mut self, op: &BinOp, lhs: &Expression, rhs: &Expression) -> InferResult<Type> {
let tf = match op.builtin.map(|b| b.get_type()) {
Some(ty) => ty,
None => return TypeError::new("no type found"),
};
let t_lhs = self.expr(lhs)?;
let t_rhs = self.expr(rhs)?; //TODO is this order a problem? not sure
self.handle_apply(tf, vec![t_lhs, t_rhs])
}
fn if_expr(&mut self, discriminator: &Discriminator, body: &IfExpressionBody) -> InferResult<Type> {
use self::Discriminator::*; use self::IfExpressionBody::*;
match (discriminator, body) {
(Simple(expr), SimpleConditional(then_clause, else_clause)) => self.handle_simple_if(expr, then_clause, else_clause),
_ => TypeError::new(format!("Complex conditionals not supported"))
}
}
fn handle_simple_if(&mut self, expr: &Expression, then_clause: &Block, else_clause: &Option<Block>) -> InferResult<Type> {
let t1 = self.expr(expr)?;
let t2 = self.block(then_clause)?;
let t3 = match else_clause {
Some(block) => self.block(block)?,
None => ty!(Unit)
};
let _ = self.unify(ty!(Bool), t1)?;
self.unify(t2, t3)
}
fn lambda(&mut self, params: &Vec<FormalParam>, type_anno: &Option<TypeIdentifier>, _body: &Block) -> InferResult<Type> {
let argument_types: InferResult<Vec<Type>> = params.iter().map(|param: &FormalParam| {
if let FormalParam { anno: Some(type_identifier), .. } = param {
self.get_type_from_name(type_identifier)
} else {
Ok(Type::Var(self.fresh_type_variable()))
}
}).collect();
let argument_types = argument_types?;
let ret_type = match type_anno.as_ref() {
Some(anno) => self.get_type_from_name(anno)?,
None => Type::Var(self.fresh_type_variable())
};
Ok(ty!(argument_types, ret_type))
}
fn call(&mut self, f: &Expression, args: &Vec<InvocationArgument>) -> InferResult<Type> {
let tf = self.expr(f)?;
let arg_types: InferResult<Vec<Type>> = args.iter().map(|ex| self.invoc(ex)).collect();
let arg_types = arg_types?;
self.handle_apply(tf, arg_types)
}
fn handle_apply(&mut self, tf: Type, args: Vec<Type>) -> InferResult<Type> {
Ok(match tf {
Type::Arrow { ref params, ret: box ref t_ret } if params.len() == args.len() => {
for (t_param, t_arg) in params.iter().zip(args.iter()) {
let _ = self.unify(t_param.clone(), t_arg.clone())?; //TODO I think this needs to reference a sub-scope
}
t_ret.clone()
},
Type::Arrow { .. } => return TypeError::new("Wrong length"),
_ => return TypeError::new(format!("Not a function"))
})
}
fn block(&mut self, block: &Block) -> InferResult<Type> {
let mut output = ty!(Unit);
for statement in block.iter() {
output = self.statement(statement)?;
}
Ok(output)
}
fn handle_value(&mut self, val: &QualifiedName) -> InferResult<Type> {
let QualifiedName { components: vec, .. } = val;
let var = &vec[0];
match self.variable_map.lookup(var) {
Some(ty) => Ok(ty.clone()),
None => TypeError::new(format!("Couldn't find variable: {}", &var)),
}
}
fn unify(&mut self, t1: Type, t2: Type) -> InferResult<Type> {
use self::Type::*;
match (t1, t2) {
(Const(ref c1), Const(ref c2)) if c1 == c2 => Ok(Const(c1.clone())), //choice of c1 is arbitrary I *think*
(a @ Var(_), b @ Const(_)) => self.unify(b, a),
(Const(ref c1), Var(ref v2)) => {
self.unification_table.unify_var_value(v2.clone(), Some(c1.clone()))
.or_else(|_| TypeError::new(format!("Couldn't unify {:?} and {:?}", Const(c1.clone()), Var(*v2))))?;
Ok(Const(c1.clone()))
},
(Var(v1), Var(v2)) => {
//TODO add occurs check
self.unification_table.unify_var_var(v1.clone(), v2.clone())
.or_else(|e| {
println!("Unify error: {:?}", e);
TypeError::new(format!("Two type variables {:?} and {:?} couldn't unify", v1, v2))
})?;
Ok(Var(v1.clone())) //arbitrary decision I think
},
(a, b) => TypeError::new(format!("{:?} and {:?} do not unify", a, b)),
}
}
fn fresh_type_variable(&mut self) -> TypeVar {
let new_type_var = self.unification_table.new_key(None);
new_type_var
}
}
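// Illustrative note on `unify` above: unification here is first-order and only covers
// constants and variables -- unify(Const(Nat), Var(t0)) records t0 := Nat in the
// unification table and returns Const(Nat), while unify(Const(Nat), Const(Bool)), or any
// call with an Arrow/Compound operand, falls through to the catch-all error arm.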
#[cfg(test)]
mod typechecking_tests {
use super::*;
macro_rules! assert_type_in_fresh_context {
($string:expr, $type:expr) => {
let mut tc = TypeContext::new();
let ref ast = crate::util::quick_ast($string);
let ty = tc.typecheck(ast).unwrap();
assert_eq!(ty, $type)
}
}
#[test]
fn basic_test() {
assert_type_in_fresh_context!("1", ty!(Nat));
assert_type_in_fresh_context!(r#""drugs""#, ty!(StringT));
assert_type_in_fresh_context!("true", ty!(Bool));
}
#[test]
fn operators() {
//TODO fix these with new operator regime
/*
assert_type_in_fresh_context!("-1", ty!(Int));
assert_type_in_fresh_context!("1 + 2", ty!(Nat));
assert_type_in_fresh_context!("-2", ty!(Int));
assert_type_in_fresh_context!("!true", ty!(Bool));
*/
}
}

180
schala-lang/src/ast.rs Normal file
View File

@@ -0,0 +1,180 @@
use std::rc::Rc;
use builtin::{BinOp, PrefixOp};
#[derive(Debug, PartialEq)]
pub struct AST(pub Vec<Statement>);
#[derive(Debug, PartialEq, Clone)]
pub enum Statement {
ExpressionStatement(Expression),
Declaration(Declaration),
}
pub type Block = Vec<Statement>;
pub type ParamName = Rc<String>;
pub type InterfaceName = Rc<String>; //should be a singleton I think??
pub type FormalParam = (ParamName, Option<TypeName>);
#[derive(Debug, PartialEq, Clone)]
pub enum Declaration {
FuncSig(Signature),
FuncDecl(Signature, Block),
TypeDecl {
name: TypeSingletonName,
body: TypeBody,
mutable: bool
},
TypeAlias(Rc<String>, Rc<String>), //should have TypeSingletonName in it, or maybe just String, not sure
Binding {
name: Rc<String>,
constant: bool,
expr: Expression,
},
Impl {
type_name: TypeName,
interface_name: Option<InterfaceName>,
block: Vec<Declaration>,
},
Interface {
name: Rc<String>,
signatures: Vec<Signature>
}
}
#[derive(Debug, PartialEq, Clone)]
pub struct Signature {
pub name: Rc<String>,
pub operator: bool,
pub params: Vec<FormalParam>,
pub type_anno: Option<TypeName>,
}
#[derive(Debug, PartialEq, Clone)]
pub struct TypeBody(pub Vec<Variant>);
#[derive(Debug, PartialEq, Clone)]
pub enum Variant {
UnitStruct(Rc<String>),
TupleStruct(Rc<String>, Vec<TypeName>),
Record(Rc<String>, Vec<(Rc<String>, TypeName)>),
}
#[derive(Debug, PartialEq, Clone)]
pub struct Expression(pub ExpressionType, pub Option<TypeName>);
#[derive(Debug, PartialEq, Clone)]
pub enum TypeName {
Tuple(Vec<TypeName>),
Singleton(TypeSingletonName)
}
#[derive(Debug, PartialEq, Clone)]
pub struct TypeSingletonName {
pub name: Rc<String>,
pub params: Vec<TypeName>,
}
#[derive(Debug, PartialEq, Clone)]
pub enum ExpressionType {
NatLiteral(u64),
FloatLiteral(f64),
StringLiteral(Rc<String>),
BoolLiteral(bool),
BinExp(BinOp, Box<Expression>, Box<Expression>),
PrefixExp(PrefixOp, Box<Expression>),
TupleLiteral(Vec<Expression>),
Value(Rc<String>),
NamedStruct {
name: Rc<String>,
fields: Vec<(Rc<String>, Expression)>,
},
Call {
f: Box<Expression>,
arguments: Vec<Expression>,
},
Index {
indexee: Box<Expression>,
indexers: Vec<Expression>,
},
IfExpression {
discriminator: Box<Discriminator>,
body: Box<IfExpressionBody>,
},
WhileExpression {
condition: Option<Box<Expression>>,
body: Block,
},
ForExpression {
enumerators: Vec<Enumerator>,
body: Box<ForBody>,
},
Lambda {
params: Vec<FormalParam>,
body: Block,
},
ListLiteral(Vec<Expression>),
}
#[derive(Debug, PartialEq, Clone)]
pub enum Discriminator {
Simple(Expression),
BinOp(Expression, BinOp)
}
#[derive(Debug, PartialEq, Clone)]
pub enum IfExpressionBody {
SimpleConditional(Block, Option<Block>),
SimplePatternMatch(Pattern, Block, Option<Block>),
GuardList(Vec<GuardArm>)
}
#[derive(Debug, PartialEq, Clone)]
pub struct GuardArm {
pub guard: Guard,
pub body: Block,
}
#[derive(Debug, PartialEq, Clone)]
pub enum Guard {
Pat(Pattern),
HalfExpr(HalfExpr)
}
#[derive(Debug, PartialEq, Clone)]
pub struct HalfExpr {
pub op: Option<BinOp>,
pub expr: ExpressionType,
}
#[derive(Debug, PartialEq, Clone)]
pub enum Pattern {
Ignored,
TuplePattern(Vec<Pattern>),
Literal(PatternLiteral),
TupleStruct(Rc<String>, Vec<Pattern>),
Record(Rc<String>, Vec<(Rc<String>, Pattern)>),
}
#[derive(Debug, PartialEq, Clone)]
pub enum PatternLiteral {
NumPattern {
neg: bool,
num: ExpressionType,
},
StringPattern(Rc<String>),
BoolPattern(bool),
VarPattern(Rc<String>)
}
#[derive(Debug, PartialEq, Clone)]
pub struct Enumerator {
pub id: Rc<String>,
pub generator: Expression,
}
#[derive(Debug, PartialEq, Clone)]
pub enum ForBody {
MonadicReturn(Expression),
StatementBlock(Block),
}

138
schala-lang/src/builtin.rs Normal file
View File

@@ -0,0 +1,138 @@
use std::rc::Rc;
use std::collections::HashMap;
use std::fmt;
use tokenizing::TokenType;
use self::BuiltinTypeSpecifier::*;
use self::BuiltinTConst::*;
#[derive(Debug, PartialEq, Clone)]
pub enum BuiltinTypeSpecifier {
Const(BuiltinTConst),
Func(Box<BuiltinTypeSpecifier>, Box<BuiltinTypeSpecifier>),
}
#[derive(Debug, PartialEq, Clone)]
pub enum BuiltinTConst {
Nat,
Int,
Float,
StringT,
Bool,
}
impl fmt::Display for BuiltinTypeSpecifier {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
write!(f, "{:?}", self)
}
}
#[derive(Debug, PartialEq, Clone)]
pub struct BinOp {
sigil: Rc<String>
}
impl BinOp {
pub fn from_sigil(sigil: &str) -> BinOp {
BinOp { sigil: Rc::new(sigil.to_string()) }
}
pub fn sigil(&self) -> &Rc<String> {
&self.sigil
}
pub fn from_sigil_token(tok: &TokenType) -> Option<BinOp> {
use self::TokenType::*;
let s = match tok {
Operator(op) => op,
Period => ".",
Pipe => "|",
Slash => "/",
LAngleBracket => "<",
RAngleBracket => ">",
_ => return None
};
Some(BinOp::from_sigil(s))
}
/*
pub fn get_type(&self) -> Result<Type, String> {
let s = self.sigil.as_str();
BINOPS.get(s).map(|x| x.0.clone()).ok_or(format!("Binop {} not found", s))
}
*/
pub fn min_precedence() -> i32 {
i32::min_value()
}
pub fn get_precedence_from_token(op: &TokenType) -> Option<i32> {
use self::TokenType::*;
let s = match op {
Operator(op) => op,
Period => ".",
Pipe => "|",
Slash => "/",
LAngleBracket => "<",
RAngleBracket => ">",
_ => return None
};
let default = 10_000_000;
Some(BINOPS.get(s).map(|x| x.2.clone()).unwrap_or_else(|| {
println!("Warning: operator {} not defined", s);
default
}))
}
}
#[derive(Debug, PartialEq, Clone)]
pub struct PrefixOp {
sigil: Rc<String>
}
impl PrefixOp {
pub fn from_sigil(sigil: &str) -> PrefixOp {
PrefixOp { sigil: Rc::new(sigil.to_string()) }
}
pub fn sigil(&self) -> &Rc<String> {
&self.sigil
}
pub fn is_prefix(op: &str) -> bool {
PREFIX_OPS.get(op).is_some()
}
/*
pub fn get_type(&self) -> Result<Type, String> {
let s = self.sigil.as_str();
PREFIX_OPS.get(s).map(|x| x.0.clone()).ok_or(format!("Prefix op {} not found", s))
}
*/
}
lazy_static! {
static ref PREFIX_OPS: HashMap<&'static str, (BuiltinTypeSpecifier, ())> =
hashmap! {
"+" => (Func(bx!(Const(Int)), bx!(Const(Int))), ()),
"-" => (Func(bx!(Const(Int)), bx!(Const(Int))), ()),
"!" => (Func(bx!(Const(Bool)), bx!(Const(Bool))), ()),
};
}
/* the second tuple member is a placeholder for when I want to make evaluation rules tied to the
* binop definition */
lazy_static! {
static ref BINOPS: HashMap<&'static str, (BuiltinTypeSpecifier, (), i32)> =
hashmap! {
"+" => (Func(bx!(Const(Nat)), bx!(Func(bx!(Const(Nat)), bx!(Const(Nat))))), (), 10),
"-" => (Func(bx!(Const(Nat)), bx!(Func(bx!(Const(Nat)), bx!(Const(Nat))))), (), 10),
"*" => (Func(bx!(Const(Nat)), bx!(Func(bx!(Const(Nat)), bx!(Const(Nat))))), (), 20),
"/" => (Func(bx!(Const(Nat)), bx!(Func(bx!(Const(Nat)), bx!(Const(Float))))), (), 20),
"//" => (Func(bx!(Const(Nat)), bx!(Func(bx!(Const(Nat)), bx!(Const(Nat))))), (), 20), //TODO change this to `quot`
"%" => (Func(bx!(Const(Nat)), bx!(Func(bx!(Const(Nat)), bx!(Const(Nat))))), (), 20),
"++" => (Func(bx!(Const(StringT)), bx!(Func(bx!(Const(StringT)), bx!(Const(StringT))))), (), 30),
"^" => (Func(bx!(Const(Nat)), bx!(Func(bx!(Const(Nat)), bx!(Const(Nat))))), (), 20),
"&" => (Func(bx!(Const(Nat)), bx!(Func(bx!(Const(Nat)), bx!(Const(Nat))))), (), 20),
"|" => (Func(bx!(Const(Nat)), bx!(Func(bx!(Const(Nat)), bx!(Const(Nat))))), (), 20),
">" => (Func(bx!(Const(Nat)), bx!(Func(bx!(Const(Nat)), bx!(Const(Nat))))), (), 20),
">=" => (Func(bx!(Const(Nat)), bx!(Func(bx!(Const(Nat)), bx!(Const(Nat))))), (), 20),
"<" => (Func(bx!(Const(Nat)), bx!(Func(bx!(Const(Nat)), bx!(Const(Nat))))), (), 20),
"<=" => (Func(bx!(Const(Nat)), bx!(Func(bx!(Const(Nat)), bx!(Const(Nat))))), (), 20),
"==" => (Func(bx!(Const(Nat)), bx!(Func(bx!(Const(Nat)), bx!(Const(Nat))))), (), 20),
"=" => (Func(bx!(Const(Nat)), bx!(Func(bx!(Const(Nat)), bx!(Const(Nat))))), (), 20),
"<=>" => (Func(bx!(Const(Nat)), bx!(Func(bx!(Const(Nat)), bx!(Const(Nat))))), (), 20),
};
}
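// Lookup sketch (hypothetical token value): for TokenType::Operator(Rc::new("*".to_string())),
// BinOp::get_precedence_from_token returns Some(20) from the BINOPS table above, while an
// operator missing from the table falls back to the default precedence of 10_000_000 and
// prints a warning.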

482
schala-lang/src/eval.rs Normal file
View File

@@ -0,0 +1,482 @@
use std::cell::RefCell;
use std::rc::Rc;
use std::fmt::Write;
use std::io;
use itertools::Itertools;
use util::ScopeStack;
use reduced_ast::{ReducedAST, Stmt, Expr, Lit, Func};
use symbol_table::{SymbolSpec, Symbol, SymbolTable};
pub struct State<'a> {
values: ScopeStack<'a, Rc<String>, ValueEntry>,
symbol_table_handle: Rc<RefCell<SymbolTable>>,
}
macro_rules! builtin_binding {
($name:expr, $values:expr) => {
$values.insert(Rc::new(format!($name)), ValueEntry::Binding { constant: true, val: Node::Expr(Expr::Func(Func::BuiltIn(Rc::new(format!($name))))) });
}
}
//TODO add a more concise way of getting a new frame
impl<'a> State<'a> {
pub fn new(symbol_table_handle: Rc<RefCell<SymbolTable>>) -> State<'a> {
let mut values = ScopeStack::new(Some(format!("global")));
builtin_binding!("print", values);
builtin_binding!("println", values);
builtin_binding!("getline", values);
State { values, symbol_table_handle }
}
pub fn debug_print(&self) -> String {
format!("Values: {:?}", self.values)
}
}
#[derive(Debug, Clone)]
enum Node {
Expr(Expr),
PrimObject {
name: Rc<String>,
tag: usize,
items: Vec<Node>,
},
PrimTuple {
items: Vec<Node>
}
}
fn paren_wrapped_vec(terms: impl Iterator<Item=String>) -> String {
let mut buf = String::new();
write!(buf, "(").unwrap();
for term in terms.map(|e| Some(e)).intersperse(None) {
match term {
Some(e) => write!(buf, "{}", e).unwrap(),
None => write!(buf, ", ").unwrap(),
};
}
write!(buf, ")").unwrap();
buf
}
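// e.g. (illustrative): paren_wrapped_vec(vec!["1".to_string(), "2".to_string()].into_iter())
// yields "(1, 2)", and an empty iterator yields "()".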
impl Node {
fn to_repl(&self) -> String {
match self {
Node::Expr(e) => e.to_repl(),
Node::PrimObject { name, items, .. } if items.len() == 0 => format!("{}", name),
Node::PrimObject { name, items, .. } => format!("{}{}", name, paren_wrapped_vec(items.iter().map(|x| x.to_repl()))),
Node::PrimTuple { items } => format!("{}", paren_wrapped_vec(items.iter().map(|x| x.to_repl()))),
}
}
}
#[derive(Debug)]
enum ValueEntry {
Binding {
constant: bool,
val: /*FullyEvaluatedExpr*/ Node, //TODO make this use a subtype that represents a fully-evaluated expression
}
}
type EvalResult<T> = Result<T, String>;
impl Expr {
fn to_node(self) -> Node {
Node::Expr(self)
}
fn to_repl(&self) -> String {
use self::Lit::*;
use self::Func::*;
match self {
Expr::Lit(ref l) => match l {
Nat(n) => format!("{}", n),
Int(i) => format!("{}", i),
Float(f) => format!("{}", f),
Bool(b) => format!("{}", b),
StringLit(s) => format!("\"{}\"", s),
},
Expr::Func(f) => match f {
BuiltIn(name) => format!("<built-in function '{}'>", name),
UserDefined { name: None, .. } => format!("<function>"),
UserDefined { name: Some(name), .. } => format!("<function '{}'>", name),
},
Expr::Constructor {
type_name: _, name, tag, arity,
} => if *arity == 0 {
format!("{}", name)
} else {
format!("<data constructor '{}'>", name)
},
Expr::Tuple(exprs) => paren_wrapped_vec(exprs.iter().map(|x| x.to_repl())),
_ => format!("{:?}", self),
}
}
}
impl<'a> State<'a> {
pub fn evaluate(&mut self, ast: ReducedAST, repl: bool) -> Vec<Result<String, String>> {
let mut acc = vec![];
// handle prebindings
for statement in ast.0.iter() {
self.prebinding(statement);
}
for statement in ast.0 {
match self.statement(statement) {
Ok(Some(ref output)) if repl => acc.push(Ok(output.to_repl())),
Ok(_) => (),
Err(error) => {
acc.push(Err(format!("Runtime error: {}", error)));
return acc;
},
}
}
acc
}
fn prebinding(&mut self, stmt: &Stmt) {
match stmt {
Stmt::PreBinding { name, func } => {
let v_entry = ValueEntry::Binding { constant: true, val: Node::Expr(Expr::Func(func.clone())) };
self.values.insert(name.clone(), v_entry);
},
Stmt::Expr(_expr) => {
//TODO have this support things like nested function defs
},
_ => ()
}
}
fn statement(&mut self, stmt: Stmt) -> EvalResult<Option<Node>> {
match stmt {
Stmt::Binding { name, constant, expr } => {
let val = self.expression(Node::Expr(expr))?;
self.values.insert(name.clone(), ValueEntry::Binding { constant, val });
Ok(None)
},
Stmt::Expr(expr) => Ok(Some(self.expression(expr.to_node())?)),
Stmt::PreBinding {..} | Stmt::Noop => Ok(None),
}
}
fn block(&mut self, stmts: Vec<Stmt>) -> EvalResult<Node> {
let mut ret = None;
for stmt in stmts {
ret = self.statement(stmt)?;
}
Ok(ret.unwrap_or(Node::Expr(Expr::Unit)))
}
fn expression(&mut self, node: Node) -> EvalResult<Node> {
use self::Expr::*;
match node {
t @ Node::PrimTuple { .. } => Ok(t),
obj @ Node::PrimObject { .. } => Ok(obj),
Node::Expr(expr) => match expr {
literal @ Lit(_) => Ok(Node::Expr(literal)),
Call { box f, args } => match self.expression(Node::Expr(f))? {
Node::Expr(Constructor { type_name, name, tag, arity }) => self.apply_data_constructor(type_name, name, tag, arity, args),
Node::Expr(Func(f)) => self.apply_function(f, args),
other => return Err(format!("Tried to call {:?} which is not a function or data constructor", other)),
},
Val(v) => self.value(v),
Constructor { arity, ref name, tag, .. } if arity == 0 => Ok(Node::PrimObject { name: name.clone(), tag, items: vec![] }),
constructor @ Constructor { .. } => Ok(Node::Expr(constructor)),
func @ Func(_) => Ok(Node::Expr(func)),
Tuple(exprs) => {
let nodes = exprs.into_iter().map(|expr| self.expression(Node::Expr(expr))).collect::<Result<Vec<Node>,_>>()?;
Ok(Node::PrimTuple { items: nodes })
},
Conditional { box cond, then_clause, else_clause } => self.conditional(cond, then_clause, else_clause),
Assign { box val, box expr } => {
let name = match val {
Expr::Val(name) => name,
_ => return Err(format!("Trying to assign to a non-value")),
};
let constant = match self.values.lookup(&name) {
None => return Err(format!("{} is undefined", name)),
Some(ValueEntry::Binding { constant, .. }) => constant.clone(),
};
if constant {
return Err(format!("trying to update {}, a non-mutable binding", name));
}
let val = self.expression(Node::Expr(expr))?;
self.values.insert(name.clone(), ValueEntry::Binding { constant: false, val });
Ok(Node::Expr(Expr::Unit))
},
Unit => Ok(Node::Expr(Unit)),
CaseMatch { box cond, alternatives } => match self.expression(Node::Expr(cond))? {
Node::PrimObject { name, tag, items } => {
for alt in alternatives {
if alt.tag.map(|t| t == tag).unwrap_or(true) {
let mut inner_state = State {
values: self.values.new_scope(None),
symbol_table_handle: self.symbol_table_handle.clone(),
};
for (bound_var, val) in alt.bound_vars.iter().zip(items.iter()) {
if let Some(bv) = bound_var.as_ref() {
inner_state.values.insert(bv.clone(), ValueEntry::Binding { constant: true, val: val.clone() });
}
}
return inner_state.block(alt.item)
}
}
return Err(format!("No matches found"));
},
Node::PrimTuple { .. } => Err(format!("Tuples don't work")),
Node::Expr(e) => Err(format!("Exprs don't work {:?}", e))
},
UnimplementedSigilValue => Err(format!("Sigil value eval not implemented"))
}
}
}
fn apply_data_constructor(&mut self, type_name: Rc<String>, name: Rc<String>, tag: usize, arity: usize, args: Vec<Expr>) -> EvalResult<Node> {
if arity != args.len() {
return Err(format!("Data constructor {} requires {} args", name, arity));
}
let evaled_args = args.into_iter().map(|expr| self.expression(Node::Expr(expr))).collect::<Result<Vec<Node>,_>>()?;
//let evaled_args = vec![];
Ok(Node::PrimObject {
name: name.clone(),
items: evaled_args,
tag
})
}
fn apply_function(&mut self, f: Func, args: Vec<Expr>) -> EvalResult<Node> {
match f {
Func::BuiltIn(sigil) => Ok(Node::Expr(self.apply_builtin(sigil, args)?)),
Func::UserDefined { params, body, name } => {
if params.len() != args.len() {
return Err(format!("calling a {}-argument function with {} args", params.len(), args.len()))
}
let mut func_state = State {
values: self.values.new_scope(name.map(|n| format!("{}", n))),
symbol_table_handle: self.symbol_table_handle.clone(),
};
for (param, val) in params.into_iter().zip(args.into_iter()) {
let val = func_state.expression(Node::Expr(val))?;
func_state.values.insert(param, ValueEntry::Binding { constant: true, val });
}
// TODO figure out function return semantics
func_state.block(body)
}
}
}
fn apply_builtin(&mut self, name: Rc<String>, args: Vec<Expr>) -> EvalResult<Expr> {
use self::Expr::*;
use self::Lit::*;
let evaled_args: Result<Vec<Expr>, String> = args.into_iter().map(|arg| {
match self.expression(Node::Expr(arg)) {
Ok(Node::Expr(e)) => Ok(e),
Ok(Node::PrimTuple { .. }) => Err(format!("Trying to apply a builtin to a tuple")),
Ok(Node::PrimObject { .. }) => Err(format!("Trying to apply a builtin to a primitive object")),
Err(e) => Err(e)
}
}).collect();
let evaled_args = evaled_args?;
Ok(match (name.as_str(), evaled_args.as_slice()) {
/* binops */
("+", &[Lit(Nat(l)), Lit(Nat(r))]) => Lit(Nat(l + r)),
("++", &[Lit(StringLit(ref s1)), Lit(StringLit(ref s2))]) => Lit(StringLit(Rc::new(format!("{}{}", s1, s2)))),
("-", &[Lit(Nat(l)), Lit(Nat(r))]) => Lit(Nat(l - r)),
("*", &[Lit(Nat(l)), Lit(Nat(r))]) => Lit(Nat(l * r)),
("/", &[Lit(Nat(l)), Lit(Nat(r))]) => Lit(Float((l as f64)/ (r as f64))),
("//", &[Lit(Nat(l)), Lit(Nat(r))]) => if r == 0 {
return Err(format!("divide by zero"));
} else {
Lit(Nat(l / r))
},
("%", &[Lit(Nat(l)), Lit(Nat(r))]) => Lit(Nat(l % r)),
("^", &[Lit(Nat(l)), Lit(Nat(r))]) => Lit(Nat(l ^ r)),
("&", &[Lit(Nat(l)), Lit(Nat(r))]) => Lit(Nat(l & r)),
("|", &[Lit(Nat(l)), Lit(Nat(r))]) => Lit(Nat(l | r)),
/* comparisons */
("==", &[Lit(Nat(l)), Lit(Nat(r))]) => Lit(Bool(l == r)),
("==", &[Lit(Int(l)), Lit(Int(r))]) => Lit(Bool(l == r)),
("==", &[Lit(Float(l)), Lit(Float(r))]) => Lit(Bool(l == r)),
("==", &[Lit(Bool(l)), Lit(Bool(r))]) => Lit(Bool(l == r)),
("==", &[Lit(StringLit(ref l)), Lit(StringLit(ref r))]) => Lit(Bool(l == r)),
("<", &[Lit(Nat(l)), Lit(Nat(r))]) => Lit(Bool(l < r)),
("<", &[Lit(Int(l)), Lit(Int(r))]) => Lit(Bool(l < r)),
("<", &[Lit(Float(l)), Lit(Float(r))]) => Lit(Bool(l < r)),
("<=", &[Lit(Nat(l)), Lit(Nat(r))]) => Lit(Bool(l <= r)),
("<=", &[Lit(Int(l)), Lit(Int(r))]) => Lit(Bool(l <= r)),
("<=", &[Lit(Float(l)), Lit(Float(r))]) => Lit(Bool(l <= r)),
(">", &[Lit(Nat(l)), Lit(Nat(r))]) => Lit(Bool(l > r)),
(">", &[Lit(Int(l)), Lit(Int(r))]) => Lit(Bool(l > r)),
(">", &[Lit(Float(l)), Lit(Float(r))]) => Lit(Bool(l > r)),
(">=", &[Lit(Nat(l)), Lit(Nat(r))]) => Lit(Bool(l >= r)),
(">=", &[Lit(Int(l)), Lit(Int(r))]) => Lit(Bool(l >= r)),
(">=", &[Lit(Float(l)), Lit(Float(r))]) => Lit(Bool(l >= r)),
/* prefix ops */
("!", &[Lit(Bool(true))]) => Lit(Bool(false)),
("!", &[Lit(Bool(false))]) => Lit(Bool(true)),
("-", &[Lit(Nat(n))]) => Lit(Int(-1*(n as i64))),
("-", &[Lit(Int(n))]) => Lit(Int(-1*(n as i64))),
("+", &[Lit(Int(n))]) => Lit(Int(n)),
("+", &[Lit(Nat(n))]) => Lit(Nat(n)),
/* builtin functions */
("print", &[ref anything]) => {
print!("{}", anything.to_repl());
Expr::Unit
},
("println", &[ref anything]) => {
println!("{}", anything.to_repl());
Expr::Unit
},
("getline", &[]) => {
let mut buf = String::new();
io::stdin().read_line(&mut buf).expect("Error reading line in 'getline'");
Lit(StringLit(Rc::new(buf.trim().to_string())))
},
(x, args) => return Err(format!("bad or unimplemented builtin {:?} | {:?}", x, args)),
})
}
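// Dispatch sketch: a call like ("+", &[Lit(Nat(1)), Lit(Nat(2))]) matches the first binop arm
// and evaluates to Lit(Nat(3)); any (name, args) pair with no matching arm falls through to
// the "bad or unimplemented builtin" error at the end of the match.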
fn conditional(&mut self, cond: Expr, then_clause: Vec<Stmt>, else_clause: Vec<Stmt>) -> EvalResult<Node> {
let cond = self.expression(Node::Expr(cond))?;
Ok(match cond {
Node::Expr(Expr::Lit(Lit::Bool(true))) => self.block(then_clause)?,
Node::Expr(Expr::Lit(Lit::Bool(false))) => self.block(else_clause)?,
_ => return Err(format!("Conditional with non-boolean condition"))
})
}
fn value(&mut self, name: Rc<String>) -> EvalResult<Node> {
use self::ValueEntry::*;
use self::Func::*;
//TODO add a layer of indirection here to talk to the symbol table first, and only then look up
//in the values table
let symbol_table = self.symbol_table_handle.borrow();
let value = symbol_table.lookup_by_name(&name);
Ok(match value {
Some(Symbol { name, spec }) => match spec {
//TODO I'll need this type_name later to do a table lookup
SymbolSpec::DataConstructor { type_name: _type_name, type_args, .. } => {
if type_args.len() == 0 {
Node::PrimObject { name: name.clone(), tag: 0, items: vec![] }
} else {
return Err(format!("This data constructor thing not done"))
}
},
SymbolSpec::Func(_) => match self.values.lookup(&name) {
Some(Binding { val: Node::Expr(Expr::Func(UserDefined { name, params, body })), .. }) => {
Node::Expr(Expr::Func(UserDefined { name: name.clone(), params: params.clone(), body: body.clone() }))
},
_ => unreachable!(),
},
},
/* see if it's an ordinary variable TODO make variables go in symbol table */
None => match self.values.lookup(&name) {
Some(Binding { val, .. }) => val.clone(),
None => return Err(format!("Couldn't find value {}", name)),
}
})
}
}
#[cfg(test)]
mod eval_tests {
use std::cell::RefCell;
use std::rc::Rc;
use symbol_table::SymbolTable;
use tokenizing::tokenize;
use parsing::parse;
use eval::State;
macro_rules! all_output {
($string:expr) => {
{
let symbol_table = Rc::new(RefCell::new(SymbolTable::new()));
let mut state = State::new(symbol_table);
let ast = parse(tokenize($string)).0.unwrap();
state.symbol_table_handle.borrow_mut().add_top_level_symbols(&ast).unwrap();
let reduced = ast.reduce(&state.symbol_table_handle.borrow());
let all_output = state.evaluate(reduced, true);
all_output
}
}
}
macro_rules! fresh_env {
($string:expr, $correct:expr) => {
{
let all_output = all_output!($string);
let ref output = all_output.last().unwrap();
assert_eq!(**output, Ok($correct.to_string()));
}
}
}
#[test]
fn test_basic_eval() {
fresh_env!("1 + 2", "3");
fresh_env!("let mut a = 1; a = 2", "Unit");
fresh_env!("let mut a = 1; a = 2; a", "2");
fresh_env!(r#"("a", 1 + 2)"#, r#"("a", 3)"#);
}
#[test]
fn function_eval() {
fresh_env!("fn oi(x) { x + 1 }; oi(4)", "5");
fresh_env!("fn oi(x) { x + 1 }; oi(1+2)", "4");
}
#[test]
fn scopes() {
let scope_ok = r#"
let a = 20
fn haha() {
let a = 10
a
}
haha()
"#;
fresh_env!(scope_ok, "10");
let scope_ok = r#"
let a = 20
fn haha() {
let a = 10
a
}
a
"#;
fresh_env!(scope_ok, "20");
}
#[test]
fn basic_patterns() {
let source = r#"
type Option<T> = Some(T) | None
let x = Some(9); if x is Some(q) then { q } else { 0 }"#;
fresh_env!(source, "9");
let source = r#"
type Option<T> = Some(T) | None
let x = None; if x is Some(q) then { q } else { 0 }"#;
fresh_env!(source, "0");
}
}

148
schala-lang/src/lib.rs Normal file
View File

@@ -0,0 +1,148 @@
#![feature(trace_macros)]
#![feature(slice_patterns, box_patterns, box_syntax)]
extern crate itertools;
#[macro_use]
extern crate lazy_static;
#[macro_use]
extern crate maplit;
#[macro_use]
extern crate schala_repl;
#[macro_use]
extern crate schala_codegen;
use std::cell::RefCell;
use std::rc::Rc;
use itertools::Itertools;
use schala_repl::{ProgrammingLanguageInterface, EvalOptions, TraceArtifact, UnfinishedComputation, FinishedComputation};
macro_rules! bx {
($e:expr) => { Box::new($e) }
}
mod util;
mod builtin;
mod tokenizing;
mod ast;
mod parsing;
mod symbol_table;
mod typechecking;
mod reduced_ast;
mod eval;
//trace_macros!(true);
#[derive(ProgrammingLanguageInterface)]
#[LanguageName = "Schala"]
#[SourceFileExtension = "schala"]
#[PipelineSteps(tokenizing, parsing(compact,expanded,trace), symbol_table, typechecking, ast_reducing, eval)]
pub struct Schala {
state: eval::State<'static>,
symbol_table: Rc<RefCell<symbol_table::SymbolTable>>,
type_context: typechecking::TypeContext<'static>,
}
impl Schala {
fn new_blank_env() -> Schala {
let symbols = Rc::new(RefCell::new(symbol_table::SymbolTable::new()));
Schala {
symbol_table: symbols.clone(),
type_context: typechecking::TypeContext::new(symbols.clone()),
state: eval::State::new(symbols),
}
}
pub fn new() -> Schala {
let prelude = r#"
type Option<T> = Some(T) | None
type Color = Red | Green | Blue
type Ord = LT | EQ | GT
"#;
let mut s = Schala::new_blank_env();
s.execute_pipeline(prelude, &EvalOptions::default());
s
}
}
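// Pipeline sketch (an assumption about the schala_codegen derive, inferred from the stage
// functions below): the derived ProgrammingLanguageInterface drives each source string through
// the PipelineSteps in order -- tokenizing -> parsing -> symbol_table -> typechecking ->
// ast_reducing -> eval -- threading the &mut Schala handle and an optional
// UnfinishedComputation (for debug artifacts) through each stage.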
fn tokenizing(_handle: &mut Schala, input: &str, comp: Option<&mut UnfinishedComputation>) -> Result<Vec<tokenizing::Token>, String> {
let tokens = tokenizing::tokenize(input);
comp.map(|comp| {
let token_string = tokens.iter().map(|t| format!("{:?}<L:{},C:{}>", t.token_type, t.offset.0, t.offset.1)).join(", ");
comp.add_artifact(TraceArtifact::new("tokens", token_string));
});
let errors: Vec<String> = tokens.iter().filter_map(|t| t.get_error()).collect();
if errors.len() == 0 {
Ok(tokens)
} else {
Err(format!("{:?}", errors))
}
}
fn parsing(_handle: &mut Schala, input: Vec<tokenizing::Token>, comp: Option<&mut UnfinishedComputation>) -> Result<ast::AST, String> {
let (ast, trace) = parsing::parse(input);
comp.map(|comp| {
//TODO need to control which of these debug stages get added
let opt = comp.cur_debug_options.get(0).map(|s| s.clone());
match opt {
None => comp.add_artifact(TraceArtifact::new("ast", format!("{:?}", ast))),
Some(ref s) if s == "compact" => comp.add_artifact(TraceArtifact::new("ast", format!("{:?}", ast))),
Some(ref s) if s == "expanded" => comp.add_artifact(TraceArtifact::new("ast", format!("{:#?}", ast))),
Some(ref s) if s == "trace" => comp.add_artifact(TraceArtifact::new_parse_trace(trace)),
Some(ref x) => println!("Bad parsing debug option: {}", x),
};
});
ast.map_err(|err| err.msg)
}
fn symbol_table(handle: &mut Schala, input: ast::AST, comp: Option<&mut UnfinishedComputation>) -> Result<ast::AST, String> {
let add = handle.symbol_table.borrow_mut().add_top_level_symbols(&input);
match add {
Ok(()) => {
let artifact = TraceArtifact::new("symbol_table", handle.symbol_table.borrow().debug_symbol_table());
comp.map(|comp| comp.add_artifact(artifact));
Ok(input)
},
Err(msg) => Err(msg)
}
}
fn typechecking(handle: &mut Schala, input: ast::AST, comp: Option<&mut UnfinishedComputation>) -> Result<ast::AST, String> {
match handle.type_context.type_check_ast(&input) {
Ok(ty) => {
comp.map(|c| {
c.add_artifact(TraceArtifact::new("type_table", format!("{}", handle.type_context.debug_types())));
c.add_artifact(TraceArtifact::new("type_check", format!("{:?}", ty)));
});
Ok(input)
},
Err(msg) => {
comp.map(|comp| {
comp.add_artifact(TraceArtifact::new("type_table", format!("{}", handle.type_context.debug_types())));
comp.add_artifact(TraceArtifact::new("type_check", format!("Type error: {:?}", msg)));
});
Ok(input)
}
}
}
fn ast_reducing(handle: &mut Schala, input: ast::AST, comp: Option<&mut UnfinishedComputation>) -> Result<reduced_ast::ReducedAST, String> {
let ref symbol_table = handle.symbol_table.borrow();
let output = input.reduce(symbol_table);
comp.map(|comp| comp.add_artifact(TraceArtifact::new("ast_reducing", format!("{:?}", output))));
Ok(output)
}
fn eval(handle: &mut Schala, input: reduced_ast::ReducedAST, comp: Option<&mut UnfinishedComputation>) -> Result<String, String> {
comp.map(|comp| comp.add_artifact(TraceArtifact::new("value_state", handle.state.debug_print())));
let evaluation_outputs = handle.state.evaluate(input, true);
let text_output: Result<Vec<String>, String> = evaluation_outputs
.into_iter()
.collect();
let eval_output: Result<String, String> = text_output
.map(|v| { v.into_iter().intersperse(format!("\n")).collect() });
eval_output
}

1539
schala-lang/src/parsing.rs Normal file

File diff suppressed because it is too large Load Diff

View File

@@ -0,0 +1,282 @@
use std::rc::Rc;
use ast::{AST, Statement, Expression, Declaration, Discriminator, IfExpressionBody, Pattern, PatternLiteral, Guard, HalfExpr};
use symbol_table::{Symbol, SymbolSpec, SymbolTable};
use builtin::{BinOp, PrefixOp};
#[derive(Debug)]
pub struct ReducedAST(pub Vec<Stmt>);
#[derive(Debug, Clone)]
pub enum Stmt {
PreBinding {
name: Rc<String>,
func: Func,
},
Binding {
name: Rc<String>,
constant: bool,
expr: Expr,
},
Expr(Expr),
Noop,
}
#[derive(Debug, Clone)]
pub enum Expr {
Unit,
Lit(Lit),
Tuple(Vec<Expr>),
Func(Func),
Val(Rc<String>),
Constructor {
type_name: Rc<String>,
name: Rc<String>,
tag: usize,
arity: usize,
},
Call {
f: Box<Expr>,
args: Vec<Expr>,
},
Assign {
val: Box<Expr>,
expr: Box<Expr>,
},
Conditional {
cond: Box<Expr>,
then_clause: Vec<Stmt>,
else_clause: Vec<Stmt>,
},
CaseMatch {
cond: Box<Expr>,
alternatives: Vec<Alternative>
},
UnimplementedSigilValue
}
#[derive(Debug, Clone)]
pub struct Alternative {
pub tag: Option<usize>,
pub guards: Vec<Expr>,
pub bound_vars: Vec<Option<Rc<String>>>, //order here mirrors the field order of a tuple-like type; None means the field is ignored
pub item: Vec<Stmt>,
}
#[derive(Debug, Clone)]
pub enum Lit {
Nat(u64),
Int(i64),
Float(f64),
Bool(bool),
StringLit(Rc<String>),
}
#[derive(Debug, Clone)]
pub enum Func {
BuiltIn(Rc<String>),
UserDefined {
name: Option<Rc<String>>,
params: Vec<Rc<String>>,
body: Vec<Stmt>,
}
}
impl AST {
pub fn reduce(&self, symbol_table: &SymbolTable) -> ReducedAST {
let mut output = vec![];
for statement in self.0.iter() {
output.push(statement.reduce(symbol_table));
}
ReducedAST(output)
}
}
impl Statement {
fn reduce(&self, symbol_table: &SymbolTable) -> Stmt {
use ast::Statement::*;
match self {
ExpressionStatement(expr) => Stmt::Expr(expr.reduce(symbol_table)),
Declaration(decl) => decl.reduce(symbol_table),
}
}
}
impl Expression {
fn reduce(&self, symbol_table: &SymbolTable) -> Expr {
use ast::ExpressionType::*;
let ref input = self.0;
match input {
NatLiteral(n) => Expr::Lit(Lit::Nat(*n)),
FloatLiteral(f) => Expr::Lit(Lit::Float(*f)),
StringLiteral(s) => Expr::Lit(Lit::StringLit(s.clone())),
BoolLiteral(b) => Expr::Lit(Lit::Bool(*b)),
BinExp(binop, lhs, rhs) => binop.reduce(symbol_table, lhs, rhs),
PrefixExp(op, arg) => op.reduce(symbol_table, arg),
Value(name) => match symbol_table.lookup_by_name(name) {
Some(Symbol { spec: SymbolSpec::DataConstructor { index, type_args, type_name}, .. }) => Expr::Constructor {
type_name: type_name.clone(),
name: name.clone(),
tag: index.clone(),
arity: type_args.len(),
},
_ => Expr::Val(name.clone()),
},
Call { f, arguments } => Expr::Call {
f: Box::new(f.reduce(symbol_table)),
args: arguments.iter().map(|arg| arg.reduce(symbol_table)).collect(),
},
TupleLiteral(exprs) => Expr::Tuple(exprs.iter().map(|e| e.reduce(symbol_table)).collect()),
IfExpression { discriminator, body } => reduce_if_expression(discriminator, body, symbol_table),
_ => Expr::UnimplementedSigilValue,
}
}
}
fn reduce_if_expression(discriminator: &Discriminator, body: &IfExpressionBody, symbol_table: &SymbolTable) -> Expr {
let cond = Box::new(match *discriminator {
Discriminator::Simple(ref expr) => expr.reduce(symbol_table),
Discriminator::BinOp(ref expr, ref binop) => {
panic!()
}
});
match *body {
IfExpressionBody::SimpleConditional(ref then_clause, ref else_clause) => {
let then_clause = then_clause.iter().map(|expr| expr.reduce(symbol_table)).collect();
let else_clause = match else_clause {
None => vec![],
Some(stmts) => stmts.iter().map(|expr| expr.reduce(symbol_table)).collect(),
};
Expr::Conditional { cond, then_clause, else_clause }
},
IfExpressionBody::SimplePatternMatch(ref pat, ref then_clause, ref else_clause) => {
let then_clause = then_clause.iter().map(|expr| expr.reduce(symbol_table)).collect();
let else_clause = match else_clause {
None => vec![],
Some(stmts) => stmts.iter().map(|expr| expr.reduce(symbol_table)).collect(),
};
let alternatives = vec![
pat.to_alternative(then_clause, symbol_table),
Alternative {
tag: None,
guards: vec![],
bound_vars: vec![],
item: else_clause,
},
];
Expr::CaseMatch {
cond,
alternatives,
}
},
IfExpressionBody::GuardList(ref guard_arms) => {
let alternatives = guard_arms.iter().map(|arm| match arm.guard {
Guard::Pat(ref p) => {
let item = arm.body.iter().map(|expr| expr.reduce(symbol_table)).collect();
p.to_alternative(item, symbol_table)
},
Guard::HalfExpr(HalfExpr { op: _, expr: _ }) => {
unimplemented!()
}
}).collect();
Expr::CaseMatch { cond, alternatives }
}
}
}
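// Reduction sketch: a plain `if cond then { a } else { b }` goes through the SimpleConditional
// arm and becomes Expr::Conditional { cond, then_clause, else_clause }, while the `is`-pattern
// form is lowered to Expr::CaseMatch with the pattern's Alternative followed by a catch-all
// (tag: None) alternative holding the reduced else-block.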
impl Pattern {
fn to_alternative(&self, item: Vec<Stmt>, symbol_table: &SymbolTable) -> Alternative {
use self::Pattern::*;
match self {
TupleStruct(name, subpatterns) => {
let symbol = symbol_table.values.get(name).expect(&format!("Symbol {} not found", name));
let tag = match symbol.spec {
SymbolSpec::DataConstructor { index, .. } => index.clone(),
_ => panic!("Bad symbol"),
};
/*
let guards = patterns.iter().map(|p| match p {
});
*/
let guards = unimplemented!();
let bound_vars = subpatterns.iter().map(|p| match p {
Literal(PatternLiteral::VarPattern(var)) => Some(var.clone()),
Ignored => None,
_ => None,
}).collect();
Alternative {
tag: Some(tag),
guards,
bound_vars,
item,
}
},
TuplePattern(_items) => {
unimplemented!()
},
Record(_name, _pairs) => {
unimplemented!()
},
Ignored => unimplemented!(),
/* "a constant appearing in a pattern can easily be eliminated by replacing it with a variable
* and adding a guard to the equation instead" - Implementation of Functional Programming
* Languages Simon Peyton-Jones, p. 58 */
Literal(lit) => match lit {
PatternLiteral::NumPattern { neg, num } => unimplemented!(),
PatternLiteral::StringPattern(_s) => unimplemented!(),
PatternLiteral::BoolPattern(_b) => unimplemented!(),
PatternLiteral::VarPattern(_var) => unimplemented!(),
},
}
}
}
impl Declaration {
fn reduce(&self, symbol_table: &SymbolTable) -> Stmt {
use self::Declaration::*;
use ::ast::Signature;
match self {
Binding {name, constant, expr } => Stmt::Binding { name: name.clone(), constant: *constant, expr: expr.reduce(symbol_table) },
FuncDecl(Signature { name, params, .. }, statements) => Stmt::PreBinding {
name: name.clone(),
func: Func::UserDefined {
name: Some(name.clone()),
params: params.iter().map(|param| param.0.clone()).collect(),
body: statements.iter().map(|stmt| stmt.reduce(symbol_table)).collect(),
}
},
TypeDecl { .. } => Stmt::Noop,
TypeAlias(_, _) => Stmt::Noop,
Interface { .. } => Stmt::Noop,
Impl { .. } => Stmt::Expr(Expr::UnimplementedSigilValue),
_ => Stmt::Expr(Expr::UnimplementedSigilValue)
}
}
}
impl BinOp {
fn reduce(&self, symbol_table: &SymbolTable, lhs: &Box<Expression>, rhs: &Box<Expression>) -> Expr {
if **self.sigil() == "=" {
Expr::Assign {
val: Box::new(lhs.reduce(symbol_table)),
expr: Box::new(rhs.reduce(symbol_table)),
}
} else {
let f = Box::new(Expr::Func(Func::BuiltIn(self.sigil().clone())));
Expr::Call { f, args: vec![lhs.reduce(symbol_table), rhs.reduce(symbol_table)]}
}
}
}
impl PrefixOp {
fn reduce(&self, symbol_table: &SymbolTable, arg: &Box<Expression>) -> Expr {
let f = Box::new(Expr::Func(Func::BuiltIn(self.sigil().clone())));
Expr::Call { f, args: vec![arg.reduce(symbol_table)]}
}
}

View File

@@ -0,0 +1,131 @@
use std::collections::HashMap;
use std::rc::Rc;
use std::fmt;
use std::fmt::Write;
use ast;
use typechecking::TypeName;
//cf. p. 150 or so of Language Implementation Patterns
pub struct SymbolTable {
pub values: HashMap<Rc<String>, Symbol> //TODO this will eventually have real type information
}
//TODO add various types of lookups here, maybe multiple hash tables internally? also make values
//non-public
impl SymbolTable {
pub fn new() -> SymbolTable {
SymbolTable { values: HashMap::new() }
}
pub fn lookup_by_name(&self, name: &Rc<String>) -> Option<&Symbol> {
self.values.get(name)
}
}
#[derive(Debug)]
pub struct Symbol {
pub name: Rc<String>,
pub spec: SymbolSpec,
}
impl fmt::Display for Symbol {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
write!(f, "<Name: {}, Spec: {}>", self.name, self.spec)
}
}
#[derive(Debug)]
pub enum SymbolSpec {
Func(Vec<TypeName>),
DataConstructor {
index: usize,
type_name: Rc<String>,
type_args: Vec<Rc<String>>,
},
}
impl fmt::Display for SymbolSpec {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
use self::SymbolSpec::*;
match self {
Func(type_names) => write!(f, "Func({:?})", type_names),
DataConstructor { index, type_name, type_args } => write!(f, "DataConstructor({})({:?} -> {})", index, type_args, type_name),
}
}
}
impl SymbolTable {
/* note: this adds names for *forward reference* but doesn't actually create any types. solve that problem
* later */
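/* Illustrative example (an assumption about intent, not from this commit): given source along
 * the lines of `foo(1); fn foo(x) ... end`, the earlier call to `foo` can be resolved because
 * `foo`'s Symbol is registered in this pre-pass, even though no concrete type exists yet. */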
pub fn add_top_level_symbols(&mut self, ast: &ast::AST) -> Result<(), String> {
use self::ast::{Statement, TypeName, Variant, TypeSingletonName, TypeBody};
use self::ast::Declaration::*;
for statement in ast.0.iter() {
if let Statement::Declaration(decl) = statement {
match decl {
FuncSig(signature) | FuncDecl(signature, _) => {
let mut ch: char = 'a';
let mut types = vec![];
for param in signature.params.iter() {
match param {
(_, Some(_ty)) => {
//TODO eventually handle this case differently
types.push(Rc::new(format!("{}", ch)));
ch = ((ch as u8) + 1) as char;
},
(_, None) => {
types.push(Rc::new(format!("{}", ch)));
ch = ((ch as u8) + 1) as char;
}
}
}
let spec = SymbolSpec::Func(types);
self.values.insert(
signature.name.clone(),
Symbol { name: signature.name.clone(), spec }
);
},
//TODO figure out why _params isn't being used here
TypeDecl { name: TypeSingletonName { name, params: _params}, body: TypeBody(variants), mutable: _mutable, } => {
for (index, var) in variants.iter().enumerate() {
match var {
Variant::UnitStruct(variant_name) => {
let spec = SymbolSpec::DataConstructor {
index,
type_name: name.clone(),
type_args: vec![],
};
self.values.insert(variant_name.clone(), Symbol { name: variant_name.clone(), spec });
},
Variant::TupleStruct(variant_name, tuple_members) => {
let type_args = tuple_members.iter().map(|type_name| match type_name {
TypeName::Singleton(TypeSingletonName { name, ..}) => name.clone(),
TypeName::Tuple(_) => unimplemented!(),
}).collect();
let spec = SymbolSpec::DataConstructor {
index,
type_name: name.clone(),
type_args
};
let symbol = Symbol { name: variant_name.clone(), spec };
self.values.insert(variant_name.clone(), symbol);
},
e => return Err(format!("{:?} not supported in typing yet", e)),
}
}
},
_ => ()
}
}
}
Ok(())
}
pub fn debug_symbol_table(&self) -> String {
let mut output = format!("Symbol table\n");
for (name, sym) in &self.values {
write!(output, "{} -> {}\n", name, sym).unwrap();
}
output
}
}

View File

@@ -5,17 +5,17 @@ use std::iter::{Iterator, Peekable};
use std::fmt;
#[derive(Debug, PartialEq, Clone)]
pub enum TokenKind {
pub enum TokenType {
Newline, Semicolon,
LParen, RParen,
LSquareBracket, RSquareBracket,
LAngleBracket, RAngleBracket,
LCurlyBrace, RCurlyBrace,
Pipe, Backslash,
Pipe,
Comma, Period, Colon, Underscore,
Slash, Equals,
Slash,
Operator(Rc<String>),
DigitGroup(Rc<String>), HexLiteral(Rc<String>), BinNumberSigil,
@@ -27,9 +27,9 @@ pub enum TokenKind {
Error(String),
}
use self::TokenKind::*;
use self::TokenType::*;
impl fmt::Display for TokenKind {
impl fmt::Display for TokenType {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
match self {
&Operator(ref s) => write!(f, "Operator({})", **s),
@@ -87,24 +87,19 @@ lazy_static! {
#[derive(Debug, Clone)]
pub struct Token {
pub kind: TokenKind,
pub line_num: usize,
pub char_num: usize
pub token_type: TokenType,
pub offset: (usize, usize),
}
impl Token {
pub fn get_error(&self) -> Option<String> {
match self.kind {
TokenKind::Error(ref s) => Some(s.clone()),
match self.token_type {
TokenType::Error(ref s) => Some(s.clone()),
_ => None,
}
}
pub fn to_string_with_metadata(&self) -> String {
format!("{}(L:{},c:{})", self.kind, self.line_num, self.char_num)
}
pub fn get_kind(&self) -> TokenKind {
self.kind.clone()
format!("{}(L:{},c:{})", self.token_type, self.offset.0, self.offset.1)
}
}
@@ -118,15 +113,15 @@ type CharData = (usize, usize, char);
pub fn tokenize(input: &str) -> Vec<Token> {
let mut tokens: Vec<Token> = Vec::new();
let mut input = input.lines().enumerate()
let mut input = input.lines().enumerate()
.intersperse((0, "\n"))
.flat_map(|(line_idx, ref line)| {
line.chars().enumerate().map(move |(ch_idx, ch)| (line_idx, ch_idx, ch))
})
.peekable();
while let Some((line_num, char_num, c)) = input.next() {
let cur_tok_kind = match c {
while let Some((line_idx, ch_idx, c)) = input.next() {
let cur_tok_type = match c {
'/' => match input.peek().map(|t| t.2) {
Some('/') => {
while let Some((_, _, c)) = input.next() {
@@ -162,18 +157,17 @@ pub fn tokenize(input: &str) -> Vec<Token> {
'{' => LCurlyBrace, '}' => RCurlyBrace,
'[' => LSquareBracket, ']' => RSquareBracket,
'"' => handle_quote(&mut input),
'\\' => Backslash,
c if c.is_digit(10) => handle_digit(c, &mut input),
c if c.is_alphabetic() || c == '_' => handle_alphabetic(c, &mut input),
c if c.is_alphabetic() || c == '_' => handle_alphabetic(c, &mut input), //TODO I'll probably have to rewrite this if I care about types being uppercase, also type parameterization
c if is_operator(&c) => handle_operator(c, &mut input),
unknown => Error(format!("Unexpected character: {}", unknown)),
};
tokens.push(Token { kind: cur_tok_kind, line_num, char_num });
tokens.push(Token { token_type: cur_tok_type, offset: (line_idx, ch_idx) });
}
tokens
}
fn handle_digit(c: char, input: &mut Peekable<impl Iterator<Item=CharData>>) -> TokenKind {
fn handle_digit(c: char, input: &mut Peekable<impl Iterator<Item=CharData>>) -> TokenType {
if c == '0' && input.peek().map_or(false, |&(_, _, c)| { c == 'x' }) {
input.next();
let rest: String = input.peeking_take_while(|&(_, _, ref c)| c.is_digit(16) || *c == '_').map(|(_, _, c)| { c }).collect();
@@ -188,7 +182,7 @@ fn handle_digit(c: char, input: &mut Peekable<impl Iterator<Item=CharData>>) ->
}
}
fn handle_quote(input: &mut Peekable<impl Iterator<Item=CharData>>) -> TokenKind {
fn handle_quote(input: &mut Peekable<impl Iterator<Item=CharData>>) -> TokenType {
let mut buf = String::new();
loop {
match input.next().map(|(_, _, c)| { c }) {
@@ -207,22 +201,22 @@ fn handle_quote(input: &mut Peekable<impl Iterator<Item=CharData>>) -> TokenKind
}
},
Some(c) => buf.push(c),
None => return TokenKind::Error(format!("Unclosed string")),
None => return TokenType::Error(format!("Unclosed string")),
}
}
TokenKind::StrLiteral(Rc::new(buf))
TokenType::StrLiteral(Rc::new(buf))
}
fn handle_alphabetic(c: char, input: &mut Peekable<impl Iterator<Item=CharData>>) -> TokenKind {
fn handle_alphabetic(c: char, input: &mut Peekable<impl Iterator<Item=CharData>>) -> TokenType {
let mut buf = String::new();
buf.push(c);
if c == '_' && input.peek().map(|&(_, _, c)| { !c.is_alphabetic() }).unwrap_or(true) {
return TokenKind::Underscore
return TokenType::Underscore
}
loop {
match input.peek().map(|&(_, _, c)| { c }) {
Some(c) if c.is_alphanumeric() || c == '_' => {
Some(c) if c.is_alphanumeric() => {
input.next();
buf.push(c);
},
@@ -231,14 +225,14 @@ fn handle_alphabetic(c: char, input: &mut Peekable<impl Iterator<Item=CharData>>
}
match KEYWORDS.get(buf.as_str()) {
Some(kw) => TokenKind::Keyword(*kw),
None => TokenKind::Identifier(Rc::new(buf)),
Some(kw) => TokenType::Keyword(*kw),
None => TokenType::Identifier(Rc::new(buf)),
}
}
fn handle_operator(c: char, input: &mut Peekable<impl Iterator<Item=CharData>>) -> TokenKind {
fn handle_operator(c: char, input: &mut Peekable<impl Iterator<Item=CharData>>) -> TokenType {
match c {
'<' | '>' | '|' | '.' | '=' => {
'<' | '>' | '|' | '.' => {
let ref next = input.peek().map(|&(_, _, c)| { c });
if !next.map(|n| { is_operator(&n) }).unwrap_or(false) {
return match c {
@@ -246,7 +240,6 @@ fn handle_operator(c: char, input: &mut Peekable<impl Iterator<Item=CharData>>)
'>' => RAngleBracket,
'|' => Pipe,
'.' => Period,
'=' => Equals,
_ => unreachable!(),
}
}
@@ -282,7 +275,7 @@ fn handle_operator(c: char, input: &mut Peekable<impl Iterator<Item=CharData>>)
}
}
}
TokenKind::Operator(Rc::new(buf))
TokenType::Operator(Rc::new(buf))
}
#[cfg(test)]
@@ -297,29 +290,26 @@ mod schala_tokenizer_tests {
#[test]
fn tokens() {
let a = tokenize("let a: A<B> = c ++ d");
let token_kinds: Vec<TokenKind> = a.into_iter().map(move |t| t.kind).collect();
assert_eq!(token_kinds, vec![Keyword(Let), ident!("a"), Colon, ident!("A"),
LAngleBracket, ident!("B"), RAngleBracket, Equals, ident!("c"), op!("++"), ident!("d")]);
let token_types: Vec<TokenType> = a.into_iter().map(move |t| t.token_type).collect();
assert_eq!(token_types, vec![Keyword(Let), ident!("a"), Colon, ident!("A"),
LAngleBracket, ident!("B"), RAngleBracket, op!("="), ident!("c"), op!("++"), ident!("d")]);
}
#[test]
fn underscores() {
let token_kinds: Vec<TokenKind> = tokenize("4_8").into_iter().map(move |t| t.kind).collect();
assert_eq!(token_kinds, vec![digit!("4"), Underscore, digit!("8")]);
let token_kinds2: Vec<TokenKind> = tokenize("aba_yo").into_iter().map(move |t| t.kind).collect();
assert_eq!(token_kinds2, vec![ident!("aba_yo")]);
let token_types: Vec<TokenType> = tokenize("4_8").into_iter().map(move |t| t.token_type).collect();
assert_eq!(token_types, vec![digit!("4"), Underscore, digit!("8")]);
}
#[test]
fn comments() {
let token_kinds: Vec<TokenKind> = tokenize("1 + /* hella /* bro */ */ 2").into_iter().map(move |t| t.kind).collect();
assert_eq!(token_kinds, vec![digit!("1"), op!("+"), digit!("2")]);
let token_types: Vec<TokenType> = tokenize("1 + /* hella /* bro */ */ 2").into_iter().map(move |t| t.token_type).collect();
assert_eq!(token_types, vec![digit!("1"), op!("+"), digit!("2")]);
}
#[test]
fn backtick_operators() {
let token_kinds: Vec<TokenKind> = tokenize("1 `plus` 2").into_iter().map(move |t| t.kind).collect();
assert_eq!(token_kinds, vec![digit!("1"), op!("plus"), digit!("2")]);
let token_types: Vec<TokenType> = tokenize("1 `plus` 2").into_iter().map(move |t| t.token_type).collect();
assert_eq!(token_types, vec![digit!("1"), op!("plus"), digit!("2")]);
}
}

View File

@@ -0,0 +1,493 @@
use std::cell::RefCell;
use std::rc::Rc;
use std::collections::HashMap;
use std::fmt;
use std::fmt::Write;
/*
use std::collections::hash_set::Union;
use std::iter::Iterator;
use itertools::Itertools;
*/
use ast;
use util::ScopeStack;
use symbol_table::{SymbolSpec, SymbolTable};
pub type TypeName = Rc<String>;
type TypeResult<T> = Result<T, String>;
#[derive(Debug, PartialEq, Clone)]
enum Type {
Const(TConst),
Var(TypeName),
Func(Vec<Type>),
}
#[derive(Debug, PartialEq, Clone)]
enum TConst {
Unit,
Nat,
StringT,
//Custom(String)
}
#[derive(Debug, PartialEq, Clone)]
struct Scheme {
names: Vec<TypeName>,
ty: Type,
}
impl fmt::Display for Scheme {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
write!(f, "∀{:?} . {:?}", self.names, self.ty)
}
}
#[derive(Debug, PartialEq, Clone)]
struct Substitution(HashMap<TypeName, Type>);
impl Substitution {
fn empty() -> Substitution {
Substitution(HashMap::new())
}
}
#[derive(Debug, PartialEq, Clone)]
struct TypeEnv(HashMap<TypeName, Scheme>);
impl TypeEnv {
fn default() -> TypeEnv {
TypeEnv(HashMap::new())
}
fn populate_from_symbols(&mut self, symbol_table: &SymbolTable) {
for (name, symbol) in symbol_table.values.iter() {
if let SymbolSpec::Func(ref type_names) = symbol.spec {
let mut ch: char = 'a';
let mut names = vec![];
for _ in type_names.iter() {
names.push(Rc::new(format!("{}", ch)));
ch = ((ch as u8) + 1) as char;
}
let sigma = Scheme {
names: names.clone(),
ty: Type::Func(names.into_iter().map(|n| Type::Var(n)).collect())
};
self.0.insert(name.clone(), sigma);
}
}
}
}
pub struct TypeContext<'a> {
values: ScopeStack<'a, TypeName, Type>,
symbol_table_handle: Rc<RefCell<SymbolTable>>,
global_env: TypeEnv
}
impl<'a> TypeContext<'a> {
pub fn new(symbol_table_handle: Rc<RefCell<SymbolTable>>) -> TypeContext<'static> {
TypeContext { values: ScopeStack::new(None), global_env: TypeEnv::default(), symbol_table_handle }
}
pub fn debug_types(&self) -> String {
let mut output = format!("Type environment\n");
for (name, scheme) in &self.global_env.0 {
write!(output, "{} -> {}\n", name, scheme).unwrap();
}
output
}
pub fn type_check_ast(&mut self, input: &ast::AST) -> Result<String, String> {
let ref symbol_table = self.symbol_table_handle.borrow();
self.global_env.populate_from_symbols(symbol_table);
let output = self.global_env.infer_block(&input.0)?;
Ok(format!("{:?}", output))
}
}
impl TypeEnv {
fn instantiate(&mut self, sigma: Scheme) -> Type {
match sigma {
Scheme { ty, .. } => ty,
}
}
fn generate(&mut self, ty: Type) -> Scheme {
Scheme {
names: vec![], //TODO incomplete
ty
}
}
fn infer_block(&mut self, block: &Vec<ast::Statement>) -> TypeResult<Type> {
let mut output = Type::Const(TConst::Unit);
for statement in block {
output = self.infer_statement(statement)?;
}
Ok(output)
}
fn infer_statement(&mut self, statement: &ast::Statement) -> TypeResult<Type> {
match statement {
ast::Statement::ExpressionStatement(expr) => self.infer_expr(expr),
ast::Statement::Declaration(decl) => self.infer_decl(decl)
}
}
fn infer_decl(&mut self, decl: &ast::Declaration) -> TypeResult<Type> {
use ast::Declaration::*;
match decl {
Binding { name, expr, .. } => {
let ty = self.infer_expr(expr)?;
let sigma = self.generate(ty);
self.0.insert(name.clone(), sigma);
},
_ => (),
}
Ok(Type::Const(TConst::Unit))
}
fn infer_expr(&mut self, expr: &ast::Expression) -> TypeResult<Type> {
match expr {
ast::Expression(expr, Some(anno)) => {
self.infer_exprtype(expr)
},
ast::Expression(expr, None) => {
self.infer_exprtype(expr)
}
}
}
fn infer_exprtype(&mut self, expr: &ast::ExpressionType) -> TypeResult<Type> {
use self::TConst::*;
use ast::ExpressionType::*;
Ok(match expr {
NatLiteral(_) => Type::Const(Nat),
StringLiteral(_) => Type::Const(StringT),
BinExp(op, lhs, rhs) => {
return Err(format!("NOTDONE"))
},
Call { f, arguments } => {
return Err(format!("NOTDONE"))
},
Value(name) => {
let s = match self.0.get(name) {
Some(sigma) => sigma.clone(),
None => return Err(format!("Unknown variable: {}", name))
};
self.instantiate(s)
},
_ => Type::Const(Unit)
})
}
}
/* GIANT TODO - use the rust im crate, unless I make this code way less haskell-ish after it's done
*/
/*
pub type TypeResult<T> = Result<T, String>;
*/
/* TODO this should just check the name against a map, and that map should be pre-populated with
* types */
/*
impl parsing::TypeName {
fn to_type(&self) -> TypeResult<Type> {
use self::parsing::TypeSingletonName;
use self::parsing::TypeName::*;
use self::Type::*; use self::TConstOld::*;
Ok(match self {
Tuple(_) => return Err(format!("Tuples not yet implemented")),
Singleton(name) => match name {
TypeSingletonName { name, .. } => match &name[..] {
/*
"Nat" => Const(Nat),
"Int" => Const(Int),
"Float" => Const(Float),
"Bool" => Const(Bool),
"String" => Const(StringT),
*/
n => Const(Custom(n.to_string()))
}
}
})
}
}
*/
/*
impl TypeContext {
pub fn type_check_ast(&mut self, ast: &parsing::AST) -> TypeResult<String> {
let ref block = ast.0;
let mut infer = Infer::default();
let env = TypeEnvironment::default();
let output = infer.infer_block(block, &env);
match output {
Ok(s) => Ok(format!("{:?}", s)),
Err(s) => Err(format!("Error: {:?}", s))
}
}
}
// this is the equivalent of the Haskell Infer monad
#[derive(Debug, Default)]
struct Infer {
_idents: u32,
}
#[derive(Debug)]
enum InferError {
CannotUnify(MonoType, MonoType),
OccursCheckFailed(Rc<String>, MonoType),
UnknownIdentifier(Rc<String>),
Custom(String),
}
type InferResult<T> = Result<T, InferError>;
impl Infer {
fn fresh(&mut self) -> MonoType {
let i = self._idents;
self._idents += 1;
let name = Rc::new(format!("{}", ('a' as u8 + 1) as char));
MonoType::Var(name)
}
fn unify(&mut self, a: MonoType, b: MonoType) -> InferResult<Substitution> {
use self::InferError::*; use self::MonoType::*;
Ok(match (a, b) {
(Const(ref a), Const(ref b)) if a == b => Substitution::new(),
(Var(ref name), ref var) => Substitution::bind_variable(name, var),
(ref var, Var(ref name)) => Substitution::bind_variable(name, var),
(Function(box a1, box b1), Function(box a2, box b2)) => {
let s1 = self.unify(a1, a2)?;
let s2 = self.unify(b1.apply_substitution(&s1), b2.apply_substitution(&s1))?;
s1.merge(s2)
},
(a, b) => return Err(CannotUnify(a, b))
})
}
fn infer_block(&mut self, block: &Vec<parsing::Statement>, env: &TypeEnvironment) -> InferResult<MonoType> {
use self::parsing::Statement;
let mut ret = MonoType::Const(TypeConst::Unit);
for statement in block.iter() {
ret = match statement {
Statement::ExpressionStatement(expr) => {
let (sub, ty) = self.infer_expr(expr, env)?;
//TODO handle substitution monadically
ty
}
Statement::Declaration(decl) => MonoType::Const(TypeConst::Unit),
}
}
Ok(ret)
}
fn infer_expr(&mut self, expr: &parsing::Expression, env: &TypeEnvironment) -> InferResult<(Substitution, MonoType)> {
use self::parsing::Expression;
match expr {
Expression(e, Some(anno)) => self.infer_annotated_expr(e, anno, env),
/*
let anno_ty = anno.to_type()?;
let ty = self.infer_exprtype(&e)?;
self.unify(ty, anno_ty)
},
*/
Expression(e, None) => self.infer_exprtype(e, env)
}
}
fn infer_annotated_expr(&mut self, expr: &parsing::ExpressionType, anno: &parsing::TypeName, env: &TypeEnvironment) -> InferResult<(Substitution, MonoType)> {
Err(InferError::Custom(format!("exprtype not done: {:?}", expr)))
}
fn infer_exprtype(&mut self, expr: &parsing::ExpressionType, env: &TypeEnvironment) -> InferResult<(Substitution, MonoType)> {
use self::parsing::ExpressionType::*;
use self::TypeConst::*;
Ok(match expr {
NatLiteral(_) => (Substitution::new(), MonoType::Const(Nat)),
FloatLiteral(_) => (Substitution::new(), MonoType::Const(Float)),
StringLiteral(_) => (Substitution::new(), MonoType::Const(StringT)),
BoolLiteral(_) => (Substitution::new(), MonoType::Const(Bool)),
Value(name) => match env.lookup(name) {
Some(sigma) => {
let tau = self.instantiate(&sigma);
(Substitution::new(), tau)
},
None => return Err(InferError::UnknownIdentifier(name.clone())),
},
e => return Err(InferError::Custom(format!("Type inference for {:?} not done", e)))
})
}
fn instantiate(&mut self, sigma: &PolyType) -> MonoType {
let ref ty: MonoType = sigma.1;
let mut subst = Substitution::new();
for name in sigma.0.iter() {
let fresh_mvar = self.fresh();
let new = Substitution::bind_variable(name, &fresh_mvar);
subst = subst.merge(new);
}
ty.apply_substitution(&subst)
}
}
*/
/* OLD STUFF DOWN HERE */
/*
impl TypeContext {
fn infer_block(&mut self, statements: &Vec<parsing::Statement>) -> TypeResult<Type> {
let mut ret_type = Type::Const(TConst::Unit);
for statement in statements {
ret_type = self.infer_statement(statement)?;
}
Ok(ret_type)
}
fn infer_statement(&mut self, statement: &parsing::Statement) -> TypeResult<Type> {
use self::parsing::Statement::*;
match statement {
ExpressionStatement(expr) => self.infer(expr),
Declaration(decl) => self.add_declaration(decl),
}
}
fn add_declaration(&mut self, decl: &parsing::Declaration) -> TypeResult<Type> {
use self::parsing::Declaration::*;
use self::Type::*;
match decl {
Binding { name, expr, .. } => {
let ty = self.infer(expr)?;
self.bindings.insert(name.clone(), ty);
},
_ => return Err(format!("other formats not done"))
}
Ok(Void)
}
fn infer(&mut self, expr: &parsing::Expression) -> TypeResult<Type> {
use self::parsing::Expression;
match expr {
Expression(e, Some(anno)) => {
let anno_ty = anno.to_type()?;
let ty = self.infer_exprtype(&e)?;
self.unify(ty, anno_ty)
},
Expression(e, None) => self.infer_exprtype(e)
}
}
fn infer_exprtype(&mut self, expr: &parsing::ExpressionType) -> TypeResult<Type> {
use self::parsing::ExpressionType::*;
use self::Type::*; use self::TConst::*;
match expr {
NatLiteral(_) => Ok(Const(Nat)),
FloatLiteral(_) => Ok(Const(Float)),
StringLiteral(_) => Ok(Const(StringT)),
BoolLiteral(_) => Ok(Const(Bool)),
BinExp(op, lhs, rhs) => { /* remember there are both the haskell convention talk and the write you a haskell ways to do this! */
match op.get_type()? {
Func(box t1, box Func(box t2, box t3)) => {
let lhs_ty = self.infer(lhs)?;
let rhs_ty = self.infer(rhs)?;
self.unify(t1, lhs_ty)?;
self.unify(t2, rhs_ty)?;
Ok(t3)
},
other => Err(format!("{:?} is not a binary function type", other))
}
},
PrefixExp(op, expr) => match op.get_type()? {
Func(box t1, box t2) => {
let expr_ty = self.infer(expr)?;
self.unify(t1, expr_ty)?;
Ok(t2)
},
other => Err(format!("{:?} is not a prefix op function type", other))
},
Value(name) => {
match self.bindings.get(name) {
Some(ty) => Ok(ty.clone()),
None => Err(format!("No binding found for variable: {}", name)),
}
},
Call { f, arguments } => {
let mut tf = self.infer(f)?;
for arg in arguments.iter() {
match tf {
Func(box t, box rest) => {
let t_arg = self.infer(arg)?;
self.unify(t, t_arg)?;
tf = rest;
},
other => return Err(format!("Function call failed to unify; last type: {:?}", other)),
}
}
Ok(tf)
},
TupleLiteral(expressions) => {
let mut types = vec![];
for expr in expressions {
types.push(self.infer(expr)?);
}
Ok(Sum(types))
},
_ => Err(format!("Type not yet implemented"))
}
}
fn unify(&mut self, t1: Type, t2: Type) -> TypeResult<Type> {
use self::Type::*;// use self::TConst::*;
match (t1, t2) {
(Const(ref a), Const(ref b)) if a == b => Ok(Const(a.clone())),
(a, b) => Err(format!("Types {:?} and {:?} don't unify", a, b))
}
}
}
*/
#[cfg(test)]
mod tests {
/*
use super::{Type, TConst, TypeContext};
use super::Type::*;
use super::TConst::*;
use std::rc::Rc;
use std::cell::RefCell;
macro_rules! type_test {
($input:expr, $correct:expr) => {
{
let symbol_table = Rc::new(RefCell::new(SymbolTable::new()));
let mut tc = TypeContext::new(symbol_table);
let ast = ::ast::parse(::tokenizing::tokenize($input)).0.unwrap() ;
//tc.add_symbols(&ast);
assert_eq!($correct, tc.infer_block(&ast.0).unwrap())
}
}
}
#[test]
fn basic_inference() {
type_test!("30", Const(Nat));
//type_test!("fn x(a: Int): Bool {}; x(1)", TConst(Boolean));
}
*/
}

View File

@@ -27,7 +27,7 @@ impl<'a, T, V> ScopeStack<'a, T, V> where T: Hash + Eq {
(Some(value), _) => Some(value),
}
}
//TODO rename new_scope
pub fn new_scope(&'a self, name: Option<String>) -> ScopeStack<'a, T, V> where T: Hash + Eq {
ScopeStack {
parent: Some(self),
@@ -35,21 +35,8 @@ impl<'a, T, V> ScopeStack<'a, T, V> where T: Hash + Eq {
scope_name: name,
}
}
#[allow(dead_code)]
pub fn get_name(&self) -> Option<&String> {
self.scope_name.as_ref()
}
}
/// this is intended for use in tests, and does no error-handling whatsoever
#[allow(dead_code)]
pub fn quick_ast(input: &str) -> crate::ast::AST {
let tokens = crate::tokenizing::tokenize(input);
let mut parser = crate::parsing::Parser::new(tokens);
parser.parse().unwrap()
}
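// Hypothetical usage in a test (illustrative only; the source string is an assumption):
//     let ast = quick_ast("let a = 1 + 2");
// Any tokenize or parse failure simply panics via the unwrap above, which is acceptable in tests.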
#[allow(unused_macros)]
macro_rules! rc {
($string:tt) => { Rc::new(stringify!($string).to_string()) }
}

View File

@@ -2,22 +2,24 @@
name = "schala-repl"
version = "0.1.0"
authors = ["greg <greg.shuflin@protonmail.com>"]
edition = "2018"
[dependencies]
llvm-sys = "70.0.2"
take_mut = "0.2.2"
llvm-sys = "*"
take_mut = "0.1.3"
itertools = "0.5.8"
getopts = "0.2.18"
getopts = "*"
lazy_static = "0.2.8"
maplit = "*"
colored = "1.8"
serde = "1.0.91"
serde_derive = "1.0.91"
serde_json = "1.0.15"
colored = "1.5"
serde = "1.0.15"
serde_derive = "1.0.15"
serde_json = "1.0.3"
rocket = "0.3.13"
rocket_codegen = "0.3.13"
rocket_contrib = "0.3.13"
phf = "0.7.12"
includedir = "0.2.0"
linefeed = "0.6.0"
linefeed = "0.5.0"
regex = "0.2"
[build-dependencies]

View File

@@ -1,80 +1,215 @@
use std::collections::HashMap;
use colored::*;
use std::fmt::Write;
use std::time;
use std::collections::HashSet;
pub struct LLVMCodeString(pub String);
#[derive(Debug, Default, Serialize, Deserialize)]
pub struct EvalOptions {
pub execution_method: ExecutionMethod,
pub debug_passes: HashMap<String, PassDebugOptionsDescriptor>,
}
#[derive(Debug, Hash, PartialEq)]
pub struct PassDescriptor {
pub name: String,
pub debug_options: Vec<String>
}
#[derive(Debug, Serialize, Deserialize)]
pub struct PassDebugOptionsDescriptor {
pub opts: Vec<String>,
}
#[derive(Debug, Serialize, Deserialize)]
pub enum ExecutionMethod {
Compile,
Interpret,
}
impl Default for ExecutionMethod {
fn default() -> ExecutionMethod {
ExecutionMethod::Interpret
}
}
#[derive(Debug, Default)]
pub struct UnfinishedComputation {
artifacts: Vec<(String, TraceArtifact)>,
pub durations: Vec<time::Duration>,
pub cur_debug_options: Vec<String>,
}
#[derive(Debug)]
pub struct FinishedComputation {
artifacts: Vec<(String, TraceArtifact)>,
durations: Vec<time::Duration>,
text_output: Result<String, String>,
}
impl UnfinishedComputation {
pub fn add_artifact(&mut self, artifact: TraceArtifact) {
self.artifacts.push((artifact.stage_name.clone(), artifact));
}
pub fn finish(self, text_output: Result<String, String>) -> FinishedComputation {
FinishedComputation {
artifacts: self.artifacts,
text_output,
durations: self.durations
}
}
pub fn output(self, output: Result<String, String>) -> FinishedComputation {
FinishedComputation {
artifacts: self.artifacts,
text_output: output,
durations: self.durations,
}
}
}
impl FinishedComputation {
pub fn to_repl(&self) -> String {
let mut buf = String::new();
for (stage, artifact) in self.artifacts.iter() {
let color = artifact.text_color;
let stage = stage.color(color).bold();
let output = artifact.debug_output.color(color);
write!(&mut buf, "{}: {}\n", stage, output).unwrap();
}
let debug_timing = true;
if debug_timing {
write!(&mut buf, "Timing: ").unwrap();
for duration in self.durations.iter() {
let timing = (duration.as_secs() as f64) + (duration.subsec_nanos() as f64 * 1e-9);
write!(&mut buf, "{}s, ", timing).unwrap()
}
write!(&mut buf, "\n").unwrap();
}
match self.text_output {
Ok(ref output) => write!(&mut buf, "{}", output).unwrap(),
Err(ref err) => write!(&mut buf, "{} {}", "Error: ".red().bold(), err).unwrap(),
}
buf
}
pub fn to_noninteractive(&self) -> Option<String> {
match self.text_output {
Ok(_) => {
let mut buf = String::new();
for (stage, artifact) in self.artifacts.iter() {
let color = artifact.text_color;
let stage = stage.color(color).bold();
let output = artifact.debug_output.color(color);
write!(&mut buf, "{}: {}\n", stage, output).unwrap();
}
if buf == "" { None } else { Some(buf) }
},
Err(ref s) => Some(format!("{} {}", "Error: ".red().bold(), s))
}
}
}
#[derive(Debug)]
pub struct TraceArtifact {
stage_name: String,
debug_output: String,
text_color: &'static str,
}
impl TraceArtifact {
pub fn new(stage: &str, debug: String) -> TraceArtifact {
let color = match stage {
"parse_trace" | "ast" => "red",
"ast_reducing" => "red",
"tokens" => "green",
"type_check" => "magenta",
_ => "blue",
};
TraceArtifact { stage_name: stage.to_string(), debug_output: debug, text_color: color}
}
pub fn new_parse_trace(trace: Vec<String>) -> TraceArtifact {
let mut output = String::new();
for t in trace {
output.push_str(&t);
output.push_str("\n");
}
TraceArtifact { stage_name: "parse_trace".to_string(), debug_output: output, text_color: "red"}
}
}
pub trait ProgrammingLanguageInterface {
fn execute_pipeline(&mut self, _input: &str, _eval_options: &EvalOptions) -> FinishedComputation {
FinishedComputation { artifacts: vec![], text_output: Err(format!("Execution pipeline not done")), durations: vec![] }
}
fn get_language_name(&self) -> String;
fn get_source_file_suffix(&self) -> String;
fn get_passes(&self) -> Vec<PassDescriptor> {
vec![]
}
fn handle_custom_interpreter_directives(&mut self, _commands: &Vec<&str>) -> Option<String> {
None
}
fn custom_interpreter_directives_help(&self) -> String {
format!(">> No custom interpreter directives specified <<")
}
}
fn run_computation(&mut self, _request: ComputationRequest) -> ComputationResponse {
ComputationResponse {
main_output: Err(format!("Computation pipeline not implemented")),
global_output_stats: GlobalOutputStats::default(),
debug_responses: vec![],
/* a pass_chain function signature looks like:
* fn(&mut ProgrammingLanguageInterface, A, Option<&mut DebugHandler>) -> Result<B, String>
*
* TODO use some kind of failure-handling library to make this better
*/
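// Illustrative sketch (an assumption, not part of this diff): a pass matching that shape, with
// `Schala` and `Token` as hypothetical stand-ins for a concrete language state and output type:
//     fn tokenizing(_lang: &mut Schala, input: &str, _handle: Option<&mut UnfinishedComputation>)
//         -> Result<Vec<Token>, String> {
//         Ok(tokenize(input))
//     }
// pass_chain! threads each pass's Ok value into the next pass and finishes the computation with
// an error message at the first Err.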
#[macro_export]
macro_rules! pass_chain {
($state:expr, $options:expr; $($pass:path), *) => {
|text_input| {
let mut comp = UnfinishedComputation::default();
pass_chain_helper! { ($state, comp, $options); text_input $(, $pass)* }
}
}
fn request_meta(&mut self, _request: LangMetaRequest) -> LangMetaResponse {
LangMetaResponse::Custom { kind: format!("not-implemented"), value: format!("") }
}
};
}
pub struct ComputationRequest<'a> {
pub source: &'a str,
pub debug_requests: HashSet<DebugAsk>,
}
pub struct ComputationResponse {
pub main_output: Result<String, String>,
pub global_output_stats: GlobalOutputStats,
pub debug_responses: Vec<DebugResponse>,
}
#[derive(Default, Debug)]
pub struct GlobalOutputStats {
pub total_duration: time::Duration,
pub stage_durations: Vec<(String, time::Duration)>
}
#[derive(Debug, Clone, Hash, Eq, PartialEq, Deserialize, Serialize)]
pub enum DebugAsk {
Timing,
ByStage { stage_name: String, token: Option<String> },
}
impl DebugAsk {
pub fn is_for_stage(&self, name: &str) -> bool {
match self {
DebugAsk::ByStage { stage_name, .. } if stage_name == name => true,
_ => false
#[macro_export]
macro_rules! pass_chain_helper {
(($state:expr, $comp:expr, $options:expr); $input:expr, $pass:path $(, $rest:path)*) => {
{
use std::time;
use schala_repl::PassDebugOptionsDescriptor;
let pass_name = stringify!($pass);
let (output, duration) = {
let ref debug_map = $options.debug_passes;
let debug_handle = match debug_map.get(pass_name) {
Some(PassDebugOptionsDescriptor { opts }) => {
let ptr = &mut $comp;
ptr.cur_debug_options = opts.clone();
Some(ptr)
}
_ => None
};
let start = time::Instant::now();
let pass_output = $pass($state, $input, debug_handle);
let elapsed = start.elapsed();
(pass_output, elapsed)
};
$comp.durations.push(duration);
match output {
Ok(result) => pass_chain_helper! { ($state, $comp, $options); result $(, $rest)* },
Err(err) => {
$comp.output(Err(format!("Pass {} failed with {:?}", pass_name, err)))
}
}
}
}
}
pub struct DebugResponse {
pub ask: DebugAsk,
pub value: String
}
pub enum LangMetaRequest {
StageNames,
Docs {
source: String,
},
Custom {
kind: String,
value: String
},
ImmediateDebug(DebugAsk),
}
pub enum LangMetaResponse {
StageNames(Vec<String>),
Docs {
doc_string: String,
},
Custom {
kind: String,
value: String
},
ImmediateDebug(DebugResponse),
};
// Done
(($state:expr, $comp:expr, $options:expr); $final_output:expr) => {
{
let final_output: FinishedComputation = $comp.finish(Ok($final_output));
final_output
}
};
}

View File

@@ -1,6 +1,7 @@
#![feature(link_args)]
#![feature(slice_patterns, box_patterns, box_syntax, proc_macro_hygiene, decl_macro)]
#![feature(slice_patterns, box_patterns, box_syntax)]
#![feature(plugin)]
#![plugin(rocket_codegen)]
extern crate getopts;
extern crate linefeed;
extern crate itertools;
@@ -9,49 +10,91 @@ extern crate colored;
#[macro_use]
extern crate serde_derive;
extern crate serde_json;
extern crate rocket;
extern crate rocket_contrib;
extern crate includedir;
extern crate phf;
use std::collections::HashSet;
use std::path::Path;
use std::fs::File;
use std::io::Read;
use std::io::{Read, Write};
use std::process::exit;
use std::default::Default;
use std::fmt::Write as FmtWrite;
use colored::*;
use itertools::Itertools;
mod repl;
mod language;
mod webapp;
pub mod llvm_wrap;
pub use language::{ProgrammingLanguageInterface,
ComputationRequest, ComputationResponse,
LangMetaRequest, LangMetaResponse,
DebugResponse, DebugAsk, GlobalOutputStats};
include!(concat!(env!("OUT_DIR"), "/static.rs"));
const VERSION_STRING: &'static str = "0.1.0";
pub fn start_repl(langs: Vec<Box<dyn ProgrammingLanguageInterface>>) {
let options = command_line_options().parse(std::env::args()).unwrap_or_else(|e| {
include!(concat!(env!("OUT_DIR"), "/static.rs"));
pub use language::{LLVMCodeString, ProgrammingLanguageInterface, EvalOptions,
ExecutionMethod, TraceArtifact, FinishedComputation, UnfinishedComputation, PassDebugOptionsDescriptor, PassDescriptor};
pub type PLIGenerator = Box<Fn() -> Box<ProgrammingLanguageInterface> + Send + Sync>;
pub fn repl_main(generators: Vec<PLIGenerator>) {
let languages: Vec<Box<ProgrammingLanguageInterface>> = generators.iter().map(|x| x()).collect();
let option_matches = program_options().parse(std::env::args()).unwrap_or_else(|e| {
println!("{:?}", e);
exit(1);
});
if options.opt_present("help") {
println!("{}", command_line_options().usage("Schala metainterpreter"));
if option_matches.opt_present("list-languages") {
for lang in languages {
println!("{}", lang.get_language_name());
}
exit(1);
}
if option_matches.opt_present("help") {
println!("{}", program_options().usage("Schala metainterpreter"));
exit(0);
}
match options.free[..] {
if option_matches.opt_present("webapp") {
webapp::web_main(generators);
exit(0);
}
let mut options = EvalOptions::default();
let debug_passes = if let Some(opts) = option_matches.opt_str("debug") {
let output: Vec<String> = opts.split_terminator(",").map(|s| s.to_string()).collect();
output
} else {
vec![]
};
let language_names: Vec<String> = languages.iter().map(|lang| {lang.get_language_name()}).collect();
let initial_index: usize =
option_matches.opt_str("lang")
.and_then(|lang| { language_names.iter().position(|x| { x.to_lowercase() == lang.to_lowercase() }) })
.unwrap_or(0);
options.execution_method = match option_matches.opt_str("eval-style") {
Some(ref s) if s == "compile" => ExecutionMethod::Compile,
_ => ExecutionMethod::Interpret,
};
match option_matches.free[..] {
[] | [_] => {
let mut repl = repl::Repl::new(langs);
repl.run_repl();
let mut repl = Repl::new(languages, initial_index);
repl.run();
}
[_, ref filename, ..] => {
run_noninteractive(filename, langs);
[_, ref filename, _..] => {
run_noninteractive(filename, languages, options, debug_passes);
}
};
}
fn run_noninteractive(filename: &str, languages: Vec<Box<dyn ProgrammingLanguageInterface>>) {
fn run_noninteractive(filename: &str, languages: Vec<Box<ProgrammingLanguageInterface>>, mut options: EvalOptions, debug_passes: Vec<String>) {
let path = Path::new(filename);
let ext = path.extension().and_then(|e| e.to_str()).unwrap_or_else(|| {
println!("Source file lacks extension");
@@ -65,28 +108,454 @@ fn run_noninteractive(filename: &str, languages: Vec<Box<dyn ProgrammingLanguage
let mut source_file = File::open(path).unwrap();
let mut buffer = String::new();
source_file.read_to_string(&mut buffer).unwrap();
let request = ComputationRequest {
source: &buffer,
debug_requests: HashSet::new(),
};
for pass in debug_passes.into_iter() {
if let Some(_) = language.get_passes().iter().find(|desc| desc.name == pass) {
options.debug_passes.insert(pass, PassDebugOptionsDescriptor { opts: vec![] });
}
}
let response = language.run_computation(request);
match response.main_output {
Ok(s) => println!("{}", s),
Err(s) => println!("{}", s)
};
match options.execution_method {
ExecutionMethod::Compile => {
/*
let llvm_bytecode = language.compile(&buffer);
compilation_sequence(llvm_bytecode, filename);
*/
panic!("Not ready to go yet");
},
ExecutionMethod::Interpret => {
let output = language.execute_pipeline(&buffer, &options);
output.to_noninteractive().map(|text| println!("{}", text));
}
}
}
#[derive(Clone)]
enum CommandTree {
Terminal(String, Option<String>),
NonTerminal(String, Vec<CommandTree>, Option<String>),
Top(Vec<CommandTree>),
}
fn command_line_options() -> getopts::Options {
impl CommandTree {
fn term(s: &str, help: Option<&str>) -> CommandTree {
CommandTree::Terminal(s.to_string(), help.map(|x| x.to_string()))
}
fn get_cmd(&self) -> String {
match self {
CommandTree::Terminal(s, _) => s.to_string(),
CommandTree::NonTerminal(s, _, _) => s.to_string(),
CommandTree::Top(_) => "".to_string(),
}
}
fn get_help(&self) -> String {
match self {
CommandTree::Terminal(_, h) => h.as_ref().map(|h| h.clone()).unwrap_or(format!("")),
CommandTree::NonTerminal(_, _, h) => h.as_ref().map(|h| h.clone()).unwrap_or(format!("")),
CommandTree::Top(_) => "".to_string(),
}
}
fn get_children(&self) -> Vec<String> {
match self {
CommandTree::Terminal(_, _) => vec![],
CommandTree::NonTerminal(_, children, _) => children.iter().map(|x| x.get_cmd()).collect(),
CommandTree::Top(children) => children.iter().map(|x| x.get_cmd()).collect(),
}
}
}
struct TabCompleteHandler {
sigil: char,
top_level_commands: CommandTree,
}
use linefeed::complete::{Completion, Completer};
use linefeed::terminal::Terminal;
impl TabCompleteHandler {
fn new(sigil: char, top_level_commands: CommandTree) -> TabCompleteHandler {
TabCompleteHandler {
top_level_commands,
sigil,
}
}
}
impl<T: Terminal> Completer<T> for TabCompleteHandler {
fn complete(&self, word: &str, prompter: &linefeed::prompter::Prompter<T>, start: usize, _end: usize) -> Option<Vec<Completion>> {
let line = prompter.buffer();
if line.starts_with(&format!("{}", self.sigil)) {
let mut words = line[1..(if start == 0 { 1 } else { start })].split_whitespace();
let mut completions = Vec::new();
let mut command_tree: Option<&CommandTree> = Some(&self.top_level_commands);
loop {
match words.next() {
None => {
let top = match command_tree {
Some(CommandTree::Top(_)) => true,
_ => false
};
let word = if top { word.get(1..).unwrap() } else { word };
for cmd in command_tree.map(|x| x.get_children()).unwrap_or(vec![]).into_iter() {
if cmd.starts_with(word) {
completions.push(Completion {
completion: format!("{}{}", if top { ":" } else { "" }, cmd),
display: Some(cmd.clone()),
suffix: linefeed::complete::Suffix::Some(' ')
})
}
}
break;
},
Some(s) => {
let new_ptr: Option<&CommandTree> = command_tree.and_then(|cm| match cm {
CommandTree::Top(children) => children.iter().find(|c| c.get_cmd() == s),
CommandTree::NonTerminal(_, children, _) => children.iter().find(|c| c.get_cmd() == s),
CommandTree::Terminal(_, _) => None,
});
command_tree = new_ptr;
}
}
}
Some(completions)
} else {
None
}
}
}
struct Repl {
options: EvalOptions,
languages: Vec<Box<ProgrammingLanguageInterface>>,
current_language_index: usize,
interpreter_directive_sigil: char,
line_reader: linefeed::interface::Interface<linefeed::terminal::DefaultTerminal>,
}
impl Repl {
fn new(languages: Vec<Box<ProgrammingLanguageInterface>>, initial_index: usize) -> Repl {
use linefeed::Interface;
let i = if initial_index < languages.len() { initial_index } else { 0 };
let line_reader = Interface::new("schala-repl").unwrap();
Repl {
options: Repl::get_options(),
languages: languages,
current_language_index: i,
interpreter_directive_sigil: ':',
line_reader
}
}
fn get_cur_language(&self) -> &ProgrammingLanguageInterface {
self.languages[self.current_language_index].as_ref()
}
fn get_options() -> EvalOptions {
File::open(".schala_repl")
.and_then(|mut file| {
let mut contents = String::new();
file.read_to_string(&mut contents)?;
Ok(contents)
})
.and_then(|contents| {
let options: EvalOptions = serde_json::from_str(&contents)?;
Ok(options)
}).unwrap_or(EvalOptions::default())
}
fn save_options(&self) {
let ref options = self.options;
let read = File::create(".schala_repl")
.and_then(|mut file| {
let buf = serde_json::to_string(options).unwrap();
file.write_all(buf.as_bytes())
});
if let Err(err) = read {
println!("Error saving .schala_repl file {}", err);
}
}
fn run(&mut self) {
use linefeed::ReadResult;
println!("Schala MetaInterpreter version {}", VERSION_STRING);
println!("Type {}help for help with the REPL", self.interpreter_directive_sigil);
self.line_reader.load_history(".schala_history").unwrap_or(());
loop {
let language_name = self.languages[self.current_language_index].get_language_name();
let directives = self.get_directives();
let tab_complete_handler = TabCompleteHandler::new(self.interpreter_directive_sigil, directives);
self.line_reader.set_completer(std::sync::Arc::new(tab_complete_handler));
let prompt_str = format!("{} >> ", language_name);
self.line_reader.set_prompt(&prompt_str);
match self.line_reader.read_line() {
Err(e) => {
println!("Terminal read error: {}", e);
},
Ok(ReadResult::Eof) => break,
Ok(ReadResult::Signal(_)) => break,
Ok(ReadResult::Input(ref input)) => {
self.line_reader.add_history_unique(input.to_string());
let output = match input.chars().nth(0) {
Some(ch) if ch == self.interpreter_directive_sigil => self.handle_interpreter_directive(input),
_ => Some(self.input_handler(input)),
};
if let Some(o) = output {
println!("=> {}", o);
}
}
}
}
self.line_reader.save_history(".schala_history").unwrap_or(());
self.save_options();
println!("Exiting...");
}
fn input_handler(&mut self, input: &str) -> String {
let ref mut language = self.languages[self.current_language_index];
let interpreter_output = language.execute_pipeline(input, &self.options);
interpreter_output.to_repl()
}
fn get_directives(&self) -> CommandTree {
let ref passes = self.get_cur_language().get_passes();
let passes_directives: Vec<CommandTree> = passes.iter()
.map(|pass_descriptor| {
let name = &pass_descriptor.name;
if pass_descriptor.debug_options.len() == 0 {
CommandTree::term(name, None)
} else {
let sub_opts: Vec<CommandTree> = pass_descriptor.debug_options.iter()
.map(|o| CommandTree::term(o, None)).collect();
CommandTree::NonTerminal(
name.clone(),
sub_opts,
None
)
}
}).collect();
CommandTree::Top(vec![
CommandTree::term("exit", Some("exit the REPL")),
CommandTree::term("quit", Some("exit the REPL")),
CommandTree::term("help", Some("Print this help message")),
CommandTree::NonTerminal(format!("debug"), vec![
CommandTree::term("passes", None),
CommandTree::NonTerminal(format!("show"), passes_directives.clone(), None),
CommandTree::NonTerminal(format!("hide"), passes_directives.clone(), None),
], Some(format!("show or hide pass info for a given pass, or display the names of all passes"))),
CommandTree::NonTerminal(format!("lang"), vec![
CommandTree::term("next", None),
CommandTree::term("prev", None),
CommandTree::NonTerminal(format!("go"), vec![], None)//TODO
], Some(format!("switch between languages, or go directly to a langauge by name"))),
])
}
fn handle_interpreter_directive(&mut self, input: &str) -> Option<String> {
let mut iter = input.chars();
iter.next();
let commands: Vec<&str> = iter
.as_str()
.split_whitespace()
.collect();
let cmd: &str = match commands.get(0).clone() {
None => return None,
Some(s) => s
};
match cmd {
"exit" | "quit" => {
self.save_options();
exit(0)
},
"lang" | "language" => match commands.get(1) {
Some(&"show") => {
let mut buf = String::new();
for (i, lang) in self.languages.iter().enumerate() {
write!(buf, "{}{}\n", if i == self.current_language_index { "* "} else { "" }, lang.get_language_name()).unwrap();
}
Some(buf)
},
Some(&"go") => match commands.get(2) {
None => Some(format!("Must specify a language name")),
Some(&desired_name) => {
for (i, _) in self.languages.iter().enumerate() {
let lang_name = self.languages[i].get_language_name();
if lang_name.to_lowercase() == desired_name.to_lowercase() {
self.current_language_index = i;
return Some(format!("Switching to {}", self.languages[self.current_language_index].get_language_name()));
}
}
Some(format!("Language {} not found", desired_name))
}
},
Some(&"next") | Some(&"n") => {
self.current_language_index = (self.current_language_index + 1) % self.languages.len();
Some(format!("Switching to {}", self.languages[self.current_language_index].get_language_name()))
},
Some(&"previous") | Some(&"p") | Some(&"prev") => {
self.current_language_index = if self.current_language_index == 0 { self.languages.len() - 1 } else { self.current_language_index - 1 };
Some(format!("Switching to {}", self.languages[self.current_language_index].get_language_name()))
},
Some(e) => Some(format!("Bad `lang(uage)` argument: {}", e)),
None => Some(format!("Valid arguments for `lang(uage)` are `show`, `next`|`n`, `previous`|`prev`|`n`"))
},
"help" => {
let mut buf = String::new();
let ref lang = self.languages[self.current_language_index];
let directives = match self.get_directives() {
CommandTree::Top(children) => children,
_ => panic!("Top-level CommandTree not Top")
};
writeln!(buf, "MetaInterpreter options").unwrap();
writeln!(buf, "-----------------------").unwrap();
for directive in directives {
let trailer = " ";
writeln!(buf, "{}{}- {}", directive.get_cmd(), trailer, directive.get_help()).unwrap();
}
writeln!(buf, "").unwrap();
writeln!(buf, "Language-specific help for {}", lang.get_language_name()).unwrap();
writeln!(buf, "-----------------------").unwrap();
writeln!(buf, "{}", lang.custom_interpreter_directives_help()).unwrap();
Some(buf)
},
"debug" => self.handle_debug(commands),
e => self.languages[self.current_language_index]
.handle_custom_interpreter_directives(&commands)
.or(Some(format!("Unknown command: {}", e)))
}
}
fn handle_debug(&mut self, commands: Vec<&str>) -> Option<String> {
let passes = self.get_cur_language().get_passes();
match commands.get(1) {
Some(&"passes") => Some(
passes.into_iter()
.map(|desc| {
if self.options.debug_passes.contains_key(&desc.name) {
let color = "green";
format!("*{}", desc.name.color(color))
} else {
desc.name
}
})
.intersperse(format!(" -> "))
.collect()),
b @ Some(&"show") | b @ Some(&"hide") => {
let show = b == Some(&"show");
let debug_pass: String = match commands.get(2) {
Some(s) => s.to_string(),
None => return Some(format!("Must specify a stage to debug")),
};
let pass_opt = commands.get(3);
if let Some(desc) = passes.iter().find(|desc| desc.name == debug_pass) {
let mut opts = vec![];
if let Some(opt) = pass_opt {
opts.push(opt.to_string());
}
let msg = format!("{} debug for pass {}", if show { "Enabling" } else { "Disabling" }, debug_pass);
if show {
self.options.debug_passes.insert(desc.name.clone(), PassDebugOptionsDescriptor { opts });
} else {
self.options.debug_passes.remove(&desc.name);
}
Some(msg)
} else {
Some(format!("Couldn't find stage: {}", debug_pass))
}
},
_ => Some(format!("Unknown debug command"))
}
}
}
/*
pub fn compilation_sequence(llvm_code: LLVMCodeString, sourcefile: &str) {
use std::process::Command;
let ll_filename = "out.ll";
let obj_filename = "out.o";
let q: Vec<&str> = sourcefile.split('.').collect();
let bin_filename = match &q[..] {
&[name, "maaru"] => name,
_ => panic!("Bad filename {}", sourcefile),
};
let LLVMCodeString(llvm_str) = llvm_code;
println!("Compilation process finished for {}", ll_filename);
File::create(ll_filename)
.and_then(|mut f| f.write_all(llvm_str.as_bytes()))
.expect("Error writing file");
let llc_output = Command::new("llc")
.args(&["-filetype=obj", ll_filename, "-o", obj_filename])
.output()
.expect("Failed to run llc");
if !llc_output.status.success() {
println!("{}", String::from_utf8_lossy(&llc_output.stderr));
}
let gcc_output = Command::new("gcc")
.args(&["-o", bin_filename, &obj_filename])
.output()
.expect("failed to run gcc");
if !gcc_output.status.success() {
println!("{}", String::from_utf8_lossy(&gcc_output.stdout));
println!("{}", String::from_utf8_lossy(&gcc_output.stderr));
}
for filename in [obj_filename].iter() {
Command::new("rm")
.arg(filename)
.output()
.expect(&format!("failed to run rm {}", filename));
}
}
*/
fn program_options() -> getopts::Options {
let mut options = getopts::Options::new();
options.optopt("s",
"eval-style",
"Specify whether to compile (if supported) or interpret the language. If not specified, the default is language-specific",
"[compile|interpret]"
);
options.optflag("",
"list-languages",
"Show a list of all supported languages");
options.optopt("l",
"lang",
"Start up REPL in a language",
"LANGUAGE");
options.optflag("h",
"help",
"Show help text");
options.optflag("w",
"webapp",
"Start up web interpreter");
options.optopt("d",
"debug",
"Debug a stage (l = tokenizer, a = AST, r = parse trace, s = symbol table)",
"[l|a|r|s]");
options
}

View File

@@ -0,0 +1,279 @@
#![allow(non_snake_case)]
#![allow(dead_code)]
extern crate llvm_sys;
use self::llvm_sys::{LLVMIntPredicate, LLVMRealPredicate};
use self::llvm_sys::prelude::*;
use self::llvm_sys::core;
use std::ptr;
use std::ffi::{CString, CStr};
use std::os::raw::c_char;
pub fn create_context() -> LLVMContextRef {
unsafe { core::LLVMContextCreate() }
}
pub fn module_create_with_name(name: &str) -> LLVMModuleRef {
unsafe {
let n = name.as_ptr() as *const _;
core::LLVMModuleCreateWithName(n)
}
}
pub fn CreateBuilderInContext(context: LLVMContextRef) -> LLVMBuilderRef {
unsafe { core::LLVMCreateBuilderInContext(context) }
}
pub fn AppendBasicBlockInContext(context: LLVMContextRef,
function: LLVMValueRef,
name: &str)
-> LLVMBasicBlockRef {
let c_name = CString::new(name).unwrap();
unsafe { core::LLVMAppendBasicBlockInContext(context, function, c_name.as_ptr()) }
}
pub fn AddFunction(module: LLVMModuleRef, name: &str, function_type: LLVMTypeRef) -> LLVMValueRef {
let c_name = CString::new(name).unwrap();
unsafe { core::LLVMAddFunction(module, c_name.as_ptr(), function_type) }
}
pub fn FunctionType(return_type: LLVMTypeRef,
mut param_types: Vec<LLVMTypeRef>,
is_var_arg: bool)
-> LLVMTypeRef {
let len = param_types.len();
unsafe {
let pointer = param_types.as_mut_ptr();
core::LLVMFunctionType(return_type,
pointer,
len as u32,
if is_var_arg { 1 } else { 0 })
}
}
pub fn GetNamedFunction(module: LLVMModuleRef,
name: &str) -> Option<LLVMValueRef> {
let c_name = CString::new(name).unwrap();
let ret = unsafe { core::LLVMGetNamedFunction(module, c_name.as_ptr()) };
if ret.is_null() {
None
} else {
Some(ret)
}
}
pub fn VoidTypeInContext(context: LLVMContextRef) -> LLVMTypeRef {
unsafe { core::LLVMVoidTypeInContext(context) }
}
pub fn DisposeBuilder(builder: LLVMBuilderRef) {
unsafe { core::LLVMDisposeBuilder(builder) }
}
pub fn DisposeModule(module: LLVMModuleRef) {
unsafe { core::LLVMDisposeModule(module) }
}
pub fn ContextDispose(context: LLVMContextRef) {
unsafe { core::LLVMContextDispose(context) }
}
pub fn PositionBuilderAtEnd(builder: LLVMBuilderRef, basic_block: LLVMBasicBlockRef) {
unsafe { core::LLVMPositionBuilderAtEnd(builder, basic_block) }
}
pub fn BuildRet(builder: LLVMBuilderRef, val: LLVMValueRef) -> LLVMValueRef {
unsafe { core::LLVMBuildRet(builder, val) }
}
pub fn BuildRetVoid(builder: LLVMBuilderRef) -> LLVMValueRef {
unsafe { core::LLVMBuildRetVoid(builder) }
}
pub fn DumpModule(module: LLVMModuleRef) {
unsafe { core::LLVMDumpModule(module) }
}
pub fn Int64TypeInContext(context: LLVMContextRef) -> LLVMTypeRef {
unsafe { core::LLVMInt64TypeInContext(context) }
}
pub fn ConstInt(int_type: LLVMTypeRef, n: u64, sign_extend: bool) -> LLVMValueRef {
unsafe { core::LLVMConstInt(int_type, n, if sign_extend { 1 } else { 0 }) }
}
pub fn BuildAdd(builder: LLVMBuilderRef,
lhs: LLVMValueRef,
rhs: LLVMValueRef,
reg_name: &str)
-> LLVMValueRef {
let name = CString::new(reg_name).unwrap();
unsafe { core::LLVMBuildAdd(builder, lhs, rhs, name.as_ptr()) }
}
pub fn BuildSub(builder: LLVMBuilderRef,
lhs: LLVMValueRef,
rhs: LLVMValueRef,
reg_name: &str)
-> LLVMValueRef {
let name = CString::new(reg_name).unwrap();
unsafe { core::LLVMBuildSub(builder, lhs, rhs, name.as_ptr()) }
}
pub fn BuildMul(builder: LLVMBuilderRef,
lhs: LLVMValueRef,
rhs: LLVMValueRef,
reg_name: &str)
-> LLVMValueRef {
let name = CString::new(reg_name).unwrap();
unsafe { core::LLVMBuildMul(builder, lhs, rhs, name.as_ptr()) }
}
pub fn BuildUDiv(builder: LLVMBuilderRef,
lhs: LLVMValueRef,
rhs: LLVMValueRef,
reg_name: &str)
-> LLVMValueRef {
let name = CString::new(reg_name).unwrap();
unsafe { core::LLVMBuildUDiv(builder, lhs, rhs, name.as_ptr()) }
}
pub fn BuildSRem(builder: LLVMBuilderRef,
lhs: LLVMValueRef,
rhs: LLVMValueRef,
reg_name: &str)
-> LLVMValueRef {
let name = CString::new(reg_name).unwrap();
unsafe { core::LLVMBuildSRem(builder, lhs, rhs, name.as_ptr()) }
}
pub fn BuildCondBr(builder: LLVMBuilderRef,
if_expr: LLVMValueRef,
then_expr: LLVMBasicBlockRef,
else_expr: LLVMBasicBlockRef) -> LLVMValueRef {
unsafe { core::LLVMBuildCondBr(builder, if_expr, then_expr, else_expr) }
}
pub fn BuildBr(builder: LLVMBuilderRef,
dest: LLVMBasicBlockRef) -> LLVMValueRef {
unsafe { core::LLVMBuildBr(builder, dest) }
}
pub fn GetInsertBlock(builder: LLVMBuilderRef) -> LLVMBasicBlockRef {
unsafe { core::LLVMGetInsertBlock(builder) }
}
pub fn BuildPhi(builder: LLVMBuilderRef, ty: LLVMTypeRef, name: &str) -> LLVMValueRef {
let name = CString::new(name).unwrap();
unsafe { core::LLVMBuildPhi(builder, ty, name.as_ptr()) }
}
pub fn SetValueName(value: LLVMValueRef, name: &str) {
let name = CString::new(name).unwrap();
unsafe {
core::LLVMSetValueName(value, name.as_ptr())
}
}
pub fn GetValueName(value: LLVMValueRef) -> String {
unsafe {
let name_ptr: *const c_char = core::LLVMGetValueName(value);
CStr::from_ptr(name_ptr).to_string_lossy().into_owned()
}
}
pub fn GetParams(function: LLVMValueRef) -> Vec<LLVMValueRef> {
let size = CountParams(function);
unsafe {
let mut container = Vec::with_capacity(size);
container.set_len(size);
core::LLVMGetParams(function, container.as_mut_ptr());
container
}
}
pub fn CountParams(function: LLVMValueRef) -> usize {
unsafe { core::LLVMCountParams(function) as usize }
}
pub fn BuildFCmp(builder: LLVMBuilderRef,
op: LLVMRealPredicate,
lhs: LLVMValueRef,
rhs: LLVMValueRef,
name: &str) -> LLVMValueRef {
let name = CString::new(name).unwrap();
unsafe { core::LLVMBuildFCmp(builder, op, lhs, rhs, name.as_ptr()) }
}
pub fn BuildZExt(builder: LLVMBuilderRef,
val: LLVMValueRef,
dest_type: LLVMTypeRef,
name: &str) -> LLVMValueRef {
let name = CString::new(name).unwrap();
unsafe { core::LLVMBuildZExt(builder, val, dest_type, name.as_ptr()) }
}
pub fn BuildUIToFP(builder: LLVMBuilderRef,
val: LLVMValueRef,
dest_type: LLVMTypeRef,
name: &str) -> LLVMValueRef {
let name = CString::new(name).unwrap();
unsafe { core::LLVMBuildUIToFP(builder, val, dest_type, name.as_ptr()) }
}
pub fn BuildICmp(builder: LLVMBuilderRef,
op: LLVMIntPredicate,
lhs: LLVMValueRef,
rhs: LLVMValueRef,
name: &str) -> LLVMValueRef {
let name = CString::new(name).unwrap();
unsafe { core::LLVMBuildICmp(builder, op, lhs, rhs, name.as_ptr()) }
}
pub fn GetBasicBlockParent(block: LLVMBasicBlockRef) -> LLVMValueRef {
unsafe { core::LLVMGetBasicBlockParent(block) }
}
pub fn GetBasicBlocks(function: LLVMValueRef) -> Vec<LLVMBasicBlockRef> {
let size = CountBasicBlocks(function);
unsafe {
let mut container = Vec::with_capacity(size);
container.set_len(size);
core::LLVMGetBasicBlocks(function, container.as_mut_ptr());
container
}
}
pub fn CountBasicBlocks(function: LLVMValueRef) -> usize {
unsafe { core::LLVMCountBasicBlocks(function) as usize }
}
pub fn PrintModuleToString(module: LLVMModuleRef) -> String {
unsafe {
let str_ptr: *const c_char = core::LLVMPrintModuleToString(module);
CStr::from_ptr(str_ptr).to_string_lossy().into_owned()
}
}
pub fn AddIncoming(phi_node: LLVMValueRef, mut incoming_values: Vec<LLVMValueRef>,
mut incoming_blocks: Vec<LLVMBasicBlockRef>) {
let count = incoming_blocks.len() as u32;
if incoming_values.len() as u32 != count {
panic!("Bad invocation of AddIncoming");
}
unsafe {
let vals = incoming_values.as_mut_ptr();
let blocks = incoming_blocks.as_mut_ptr();
core::LLVMAddIncoming(phi_node, vals, blocks, count)
}
}
pub fn PrintModuleToFile(module: LLVMModuleRef, filename: &str) -> LLVMBool {
let out_file = CString::new(filename).unwrap();
unsafe { core::LLVMPrintModuleToFile(module, out_file.as_ptr(), ptr::null_mut()) }
}

View File

@@ -1,99 +0,0 @@
use super::{Repl, InterpreterDirectiveOutput};
use crate::repl::directive_actions::DirectiveAction;
use colored::*;
/// A CommandTree is either a `Terminal` or a `NonTerminal`. When command parsing reaches the first
/// Terminal, it will use the `DirectiveAction` found there to find an appropriate function to execute,
/// and then execute it with any remaining arguments
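/// Illustrative example (an assumption about how a tree might be built, not from this file):
/// for input `:debug show tokens`, `perform` would walk Top -> "debug" -> "show"; if "show" is a
/// Terminal, its DirectiveAction runs with the leftover arguments `["tokens"]`.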
#[derive(Clone)]
pub enum CommandTree {
Terminal {
name: String,
children: Vec<CommandTree>,
help_msg: Option<String>,
action: DirectiveAction,
},
NonTerminal {
name: String,
children: Vec<CommandTree>,
help_msg: Option<String>,
action: DirectiveAction,
},
Top(Vec<CommandTree>),
}
impl CommandTree {
pub fn nonterm_no_further_tab_completions(s: &str, help: Option<&str>) -> CommandTree {
CommandTree::NonTerminal {name: s.to_string(), help_msg: help.map(|x| x.to_string()), children: vec![], action: DirectiveAction::Null }
}
pub fn terminal(s: &str, help: Option<&str>, children: Vec<CommandTree>, action: DirectiveAction) -> CommandTree {
CommandTree::Terminal {name: s.to_string(), help_msg: help.map(|x| x.to_string()), children, action}
}
pub fn nonterm(s: &str, help: Option<&str>, children: Vec<CommandTree>) -> CommandTree {
CommandTree::NonTerminal {
name: s.to_string(),
help_msg: help.map(|x| x.to_string()),
children,
action: DirectiveAction::Null
}
}
pub fn get_cmd(&self) -> &str {
match self {
CommandTree::Terminal { name, .. } => name.as_str(),
CommandTree::NonTerminal {name, ..} => name.as_str(),
CommandTree::Top(_) => "",
}
}
pub fn get_help(&self) -> &str {
match self {
CommandTree::Terminal { help_msg, ..} => help_msg.as_ref().map(|s| s.as_str()).unwrap_or(""),
CommandTree::NonTerminal { help_msg, .. } => help_msg.as_ref().map(|s| s.as_str()).unwrap_or(""),
CommandTree::Top(_) => ""
}
}
pub fn get_children(&self) -> &Vec<CommandTree> {
use CommandTree::*;
match self {
Terminal { children, .. } |
NonTerminal { children, .. } |
Top(children) => children
}
}
pub fn get_subcommands(&self) -> Vec<&str> {
self.get_children().iter().map(|x| x.get_cmd()).collect()
}
pub fn perform(&self, repl: &mut Repl, arguments: &Vec<&str>) -> InterpreterDirectiveOutput {
let mut dir_pointer: &CommandTree = self;
let mut idx = 0;
let res: Result<(DirectiveAction, usize), String> = loop {
match dir_pointer {
CommandTree::Top(subcommands) | CommandTree::NonTerminal { children: subcommands, .. } => {
let next_command = match arguments.get(idx) {
Some(cmd) => cmd,
None => break Err(format!("Command requires arguments"))
};
idx += 1;
match subcommands.iter().find(|sc| sc.get_cmd() == *next_command) {
Some(command_tree) => {
dir_pointer = command_tree;
},
None => break Err(format!("Command {} not found", next_command))
};
},
CommandTree::Terminal { action, .. } => {
break Ok((action.clone(), idx));
},
}
};
match res {
Ok((action, idx)) => action.perform(repl, &arguments[idx..]),
Err(err) => Some(err.red().to_string())
}
}
}
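
A minimal sketch of how a tree built from these constructors looks to the rest of the REPL; `DirectiveAction::Null` keeps it independent of a live `Repl`, and the test assumes it sits in this file so both types are in scope.

#[test]
fn command_tree_sketch() {
    let tree = CommandTree::Top(vec![
        CommandTree::nonterm("debug", Some("Configure debug information"), vec![
            CommandTree::terminal("show", Some("Show debug output"), vec![], DirectiveAction::Null),
            CommandTree::terminal("hide", Some("Hide debug output"), vec![], DirectiveAction::Null),
        ]),
    ]);
    // Tab completion walks these subcommand lists level by level.
    assert_eq!(tree.get_subcommands(), vec!["debug"]);
    assert_eq!(tree.get_children()[0].get_subcommands(), vec!["show", "hide"]);
}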

View File

@@ -1,133 +0,0 @@
use super::{Repl, InterpreterDirectiveOutput};
use crate::repl::help::help;
use crate::language::{LangMetaRequest, LangMetaResponse, DebugAsk, DebugResponse};
use itertools::Itertools;
use std::fmt::Write as FmtWrite;
#[derive(Debug, Clone)]
pub enum DirectiveAction {
Null,
Help,
QuitProgram,
ListPasses,
ShowImmediate,
Show,
Hide,
TotalTimeOff,
TotalTimeOn,
StageTimeOff,
StageTimeOn,
Doc,
}
impl DirectiveAction {
pub fn perform(&self, repl: &mut Repl, arguments: &[&str]) -> InterpreterDirectiveOutput {
use DirectiveAction::*;
match self {
Null => None,
Help => help(repl, arguments),
QuitProgram => {
repl.save_before_exit();
::std::process::exit(0)
},
ListPasses => {
let language_state = repl.get_cur_language_state();
let pass_names = match language_state.request_meta(LangMetaRequest::StageNames) {
LangMetaResponse::StageNames(names) => names,
_ => vec![],
};
let mut buf = String::new();
for pass in pass_names.iter().map(|name| Some(name)).intersperse(None) {
match pass {
Some(pass) => write!(buf, "{}", pass).unwrap(),
None => write!(buf, " -> ").unwrap(),
}
}
Some(buf)
},
ShowImmediate => {
let cur_state = repl.get_cur_language_state();
let stage_name = match arguments.get(0) {
Some(s) => s.to_string(),
None => return Some(format!("Must specify a thing to debug")),
};
let meta = LangMetaRequest::ImmediateDebug(DebugAsk::ByStage { stage_name: stage_name.clone(), token: None });
let meta_response = cur_state.request_meta(meta);
let response = match meta_response {
LangMetaResponse::ImmediateDebug(DebugResponse { ask, value }) => match ask {
DebugAsk::ByStage { stage_name: ref this_stage_name, ..} if *this_stage_name == stage_name => value,
_ => return Some(format!("Wrong debug stage"))
},
_ => return Some(format!("Invalid language meta response")),
};
Some(response)
},
Show => {
let this_stage_name = match arguments.get(0) {
Some(s) => s.to_string(),
None => return Some(format!("Must specify a stage to show")),
};
let token = arguments.get(1).map(|s| s.to_string());
repl.options.debug_asks.retain(|ask| match ask {
DebugAsk::ByStage { stage_name, .. } if *stage_name == this_stage_name => false,
_ => true
});
let ask = DebugAsk::ByStage { stage_name: this_stage_name, token };
repl.options.debug_asks.insert(ask);
None
},
Hide => {
let stage_name_to_remove = match arguments.get(0) {
Some(s) => s.to_string(),
None => return Some(format!("Must specify a stage to hide")),
};
repl.options.debug_asks.retain(|ask| match ask {
DebugAsk::ByStage { stage_name, .. } if *stage_name == stage_name_to_remove => false,
_ => true
});
None
},
TotalTimeOff => total_time_off(repl, arguments),
TotalTimeOn => total_time_on(repl, arguments),
StageTimeOff => stage_time_off(repl, arguments),
StageTimeOn => stage_time_on(repl, arguments),
Doc => doc(repl, arguments),
}
}
}
fn total_time_on(repl: &mut Repl, _: &[&str]) -> InterpreterDirectiveOutput {
repl.options.show_total_time = true;
None
}
fn total_time_off(repl: &mut Repl, _: &[&str]) -> InterpreterDirectiveOutput {
repl.options.show_total_time = false;
None
}
fn stage_time_on(repl: &mut Repl, _: &[&str]) -> InterpreterDirectiveOutput {
repl.options.show_stage_times = true;
None
}
fn stage_time_off(repl: &mut Repl, _: &[&str]) -> InterpreterDirectiveOutput {
repl.options.show_stage_times = false;
None
}
fn doc(repl: &mut Repl, arguments: &[&str]) -> InterpreterDirectiveOutput {
arguments.get(0).map(|cmd| {
let source = cmd.to_string();
let meta = LangMetaRequest::Docs { source };
let cur_state = repl.get_cur_language_state();
match cur_state.request_meta(meta) {
LangMetaResponse::Docs { doc_string } => Some(doc_string),
_ => Some(format!("Invalid doc response"))
}
}).unwrap_or(Some(format!(":docs needs an argument")))
}
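
The `Show`/`Hide` arms amount to set membership on `options.debug_asks`; below is a sketch of that retain-then-insert pattern in isolation. The stage and token values are examples only, and the test assumes it sits in this file so `DebugAsk` is in scope.

#[test]
fn debug_ask_toggle_sketch() {
    use std::collections::HashSet;
    let mut asks: HashSet<DebugAsk> = HashSet::new();
    // `:debug show parsing compact` inserts an ask for that stage...
    asks.insert(DebugAsk::ByStage { stage_name: "parsing".to_string(), token: Some("compact".to_string()) });
    // ...and `:debug hide parsing` drops any ask for the stage, whatever its token was.
    asks.retain(|ask| match ask {
        DebugAsk::ByStage { stage_name, .. } if stage_name == "parsing" => false,
        _ => true,
    });
    assert!(asks.is_empty());
}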

View File

@@ -1,55 +0,0 @@
use crate::repl::command_tree::CommandTree;
use crate::repl::directive_actions::DirectiveAction;
pub fn directives_from_pass_names(pass_names: &Vec<String>) -> CommandTree {
let passes_directives: Vec<CommandTree> = pass_names.iter()
.map(|pass_name| {
if pass_name == "parsing" {
CommandTree::nonterm(pass_name, None, vec![
CommandTree::nonterm_no_further_tab_completions("compact", None),
CommandTree::nonterm_no_further_tab_completions("expanded", None),
CommandTree::nonterm_no_further_tab_completions("trace", None),
])
} else {
CommandTree::nonterm_no_further_tab_completions(pass_name, None)
}
})
.collect();
CommandTree::Top(get_list(&passes_directives, true))
}
fn get_list(passes_directives: &Vec<CommandTree>, include_help: bool) -> Vec<CommandTree> {
use DirectiveAction::*;
vec![
CommandTree::terminal("exit", Some("exit the REPL"), vec![], QuitProgram),
CommandTree::terminal("quit", Some("exit the REPL"), vec![], QuitProgram),
CommandTree::terminal("help", Some("Print this help message"), if include_help { get_list(passes_directives, false) } else { vec![] }, Help),
CommandTree::nonterm("debug",
Some("Configure debug information"),
vec![
CommandTree::terminal("list-passes", Some("List all registered compiler passes"), vec![], ListPasses),
CommandTree::terminal("show-immediate", None, passes_directives.clone(), ShowImmediate),
CommandTree::terminal("show", Some("Show debug output for a specific pass"), passes_directives.clone(), Show),
CommandTree::terminal("hide", Some("Hide debug output for a specific pass"), passes_directives.clone(), Hide),
CommandTree::nonterm("total-time", None, vec![
CommandTree::terminal("on", None, vec![], TotalTimeOn),
CommandTree::terminal("off", None, vec![], TotalTimeOff),
]),
CommandTree::nonterm("stage-times", Some("Computation time per-stage"), vec![
CommandTree::terminal("on", None, vec![], StageTimeOn),
CommandTree::terminal("off", None, vec![], StageTimeOff),
])
]
),
CommandTree::nonterm("lang",
Some("switch between languages, or go directly to a langauge by name"),
vec![
CommandTree::nonterm_no_further_tab_completions("next", None),
CommandTree::nonterm_no_further_tab_completions("prev", None),
CommandTree::nonterm("go", None, vec![]),
]
),
CommandTree::terminal("doc", Some("Get language-specific help for an item"), vec![], Doc),
]
}
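
Given a language's pass names, this tree is what backs prompt directives such as `:debug show parsing compact` or `:lang next`. A sketch of the shape it produces (the pass names are examples only):

#[test]
fn directives_shape_sketch() {
    let pass_names = vec!["tokenizing".to_string(), "parsing".to_string()];
    let tree = directives_from_pass_names(&pass_names);
    // The top level always carries the fixed directives alongside the debug subtree.
    let top_level = tree.get_subcommands();
    assert!(top_level.contains(&"debug"));
    assert!(top_level.contains(&"lang"));
    assert!(top_level.contains(&"doc"));
}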

View File

@@ -1,54 +0,0 @@
use std::fmt::Write as FmtWrite;
use colored::*;
use super::command_tree::CommandTree;
use super::{Repl, InterpreterDirectiveOutput};
pub fn help(repl: &mut Repl, arguments: &[&str]) -> InterpreterDirectiveOutput {
match arguments {
[] => return global_help(repl),
commands => {
let dirs = repl.get_directives();
Some(match get_directive_from_commands(commands, &dirs) {
None => format!("Directive `{}` not found", commands.last().unwrap()),
Some(dir) => {
let mut buf = String::new();
writeln!(buf, "`{}` - {}", dir.get_cmd(), dir.get_help()).unwrap();
buf
}
})
}
}
}
fn get_directive_from_commands<'a>(commands: &[&str], dirs: &'a CommandTree) -> Option<&'a CommandTree> {
let mut directive_list = dirs.get_children();
let mut matched_directive = None;
for cmd in commands {
let found = directive_list.iter().find(|directive| directive.get_cmd() == *cmd);
if let Some(dir) = found {
directive_list = dir.get_children();
}
matched_directive = found;
}
matched_directive
}
fn global_help(repl: &mut Repl) -> InterpreterDirectiveOutput {
let mut buf = String::new();
let sigil = repl.interpreter_directive_sigil;
writeln!(buf, "{} version {}", "Schala REPL".bright_red().bold(), crate::VERSION_STRING).unwrap();
writeln!(buf, "-----------------------").unwrap();
for directive in repl.get_directives().get_children() {
writeln!(buf, "{}{} - {}", sigil, directive.get_cmd(), directive.get_help()).unwrap();
}
let ref lang = repl.get_cur_language_state();
writeln!(buf, "").unwrap();
writeln!(buf, "Language-specific help for {}", lang.get_language_name()).unwrap();
writeln!(buf, "-----------------------").unwrap();
Some(buf)
}
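
A sketch of the lookup that `:help debug show` performs. Both functions are private to the repl module, so this is written as an in-module test, and the pass name is an example only.

#[test]
fn help_lookup_sketch() {
    use super::directives::directives_from_pass_names;
    let dirs = directives_from_pass_names(&vec!["parsing".to_string()]);
    // Walking ["debug", "show"] should land on the `show` terminal and its help text.
    let found = get_directive_from_commands(&["debug", "show"], &dirs)
        .expect("`debug show` should resolve to a directive");
    assert_eq!(found.get_cmd(), "show");
    assert_eq!(found.get_help(), "Show debug output for a specific pass");
}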

View File

@@ -1,209 +0,0 @@
use std::sync::Arc;
use std::collections::HashSet;
use crate::language::{ProgrammingLanguageInterface,
ComputationRequest, LangMetaResponse, LangMetaRequest};
mod command_tree;
use self::command_tree::CommandTree;
mod repl_options;
use repl_options::ReplOptions;
mod directive_actions;
mod directives;
use directives::directives_from_pass_names;
mod help;
mod response;
use response::ReplResponse;
const HISTORY_SAVE_FILE: &'static str = ".schala_history";
const OPTIONS_SAVE_FILE: &'static str = ".schala_repl";
type InterpreterDirectiveOutput = Option<String>;
pub struct Repl {
pub interpreter_directive_sigil: char,
line_reader: ::linefeed::interface::Interface<::linefeed::terminal::DefaultTerminal>,
language_states: Vec<Box<dyn ProgrammingLanguageInterface>>,
options: ReplOptions,
}
impl Repl {
pub fn new(initial_states: Vec<Box<dyn ProgrammingLanguageInterface>>) -> Repl {
use linefeed::Interface;
let line_reader = Interface::new("schala-repl").unwrap();
let interpreter_directive_sigil = ':';
Repl {
interpreter_directive_sigil,
line_reader,
language_states: initial_states,
options: ReplOptions::new(),
}
}
pub fn run_repl(&mut self) {
println!("Schala MetaInterpreter version {}", crate::VERSION_STRING);
println!("Type {}help for help with the REPL", self.interpreter_directive_sigil);
self.load_options();
self.handle_repl_loop();
self.save_before_exit();
println!("Exiting...");
}
fn load_options(&mut self) {
self.line_reader.load_history(HISTORY_SAVE_FILE).unwrap_or(());
match ReplOptions::load_from_file(OPTIONS_SAVE_FILE) {
Ok(options) => {
self.options = options;
},
Err(()) => ()
};
}
fn handle_repl_loop(&mut self) {
use linefeed::ReadResult::*;
loop {
self.update_line_reader();
match self.line_reader.read_line() {
Err(e) => {
println!("readline IO Error: {}", e);
break;
},
Ok(Eof) | Ok(Signal(_)) => break,
Ok(Input(ref input)) => {
self.line_reader.add_history_unique(input.to_string());
match input.chars().nth(0) {
Some(ch) if ch == self.interpreter_directive_sigil => match self.handle_interpreter_directive(input) {
Some(directive_output) => println!("<> {}", directive_output),
None => (),
},
_ => {
for repl_response in self.handle_input(input) {
println!("{}", repl_response);
}
}
}
}
}
}
}
fn update_line_reader(&mut self) {
let tab_complete_handler = TabCompleteHandler::new(self.interpreter_directive_sigil, self.get_directives());
self.line_reader.set_completer(Arc::new(tab_complete_handler)); //TODO fix this here
let prompt_str = ">> ".to_string();
self.line_reader.set_prompt(&prompt_str).unwrap();
}
fn save_before_exit(&self) {
self.line_reader.save_history(HISTORY_SAVE_FILE).unwrap_or(());
self.options.save_to_file(OPTIONS_SAVE_FILE);
}
fn handle_interpreter_directive(&mut self, input: &str) -> InterpreterDirectiveOutput {
let mut iter = input.chars();
iter.next();
let arguments: Vec<&str> = iter
.as_str()
.split_whitespace()
.collect();
if arguments.len() < 1 {
return None;
}
let directives = self.get_directives();
directives.perform(self, &arguments)
}
fn get_cur_language_state(&mut self) -> &mut Box<dyn ProgrammingLanguageInterface> {
//TODO this is obviously not complete
&mut self.language_states[0]
}
fn handle_input(&mut self, input: &str) -> Vec<ReplResponse> {
let mut debug_requests = HashSet::new();
for ask in self.options.debug_asks.iter() {
debug_requests.insert(ask.clone());
}
let request = ComputationRequest { source: input, debug_requests };
let ref mut language_state = self.get_cur_language_state();
let response = language_state.run_computation(request);
response::handle_computation_response(response, &self.options)
}
fn get_directives(&mut self) -> CommandTree {
let language_state = self.get_cur_language_state();
let pass_names = match language_state.request_meta(LangMetaRequest::StageNames) {
LangMetaResponse::StageNames(names) => names,
_ => vec![],
};
directives_from_pass_names(&pass_names)
}
}
struct TabCompleteHandler {
sigil: char,
top_level_commands: CommandTree,
}
use linefeed::complete::{Completion, Completer};
use linefeed::terminal::Terminal;
impl TabCompleteHandler {
fn new(sigil: char, top_level_commands: CommandTree) -> TabCompleteHandler {
TabCompleteHandler {
top_level_commands,
sigil,
}
}
}
impl<T: Terminal> Completer<T> for TabCompleteHandler {
fn complete(&self, word: &str, prompter: &::linefeed::prompter::Prompter<T>, start: usize, _end: usize) -> Option<Vec<Completion>> {
let line = prompter.buffer();
if !line.starts_with(self.sigil) {
return None;
}
let mut words = line[1..(if start == 0 { 1 } else { start })].split_whitespace();
let mut completions = Vec::new();
let mut command_tree: Option<&CommandTree> = Some(&self.top_level_commands);
loop {
match words.next() {
None => {
let top = match command_tree {
Some(CommandTree::Top(_)) => true,
_ => false
};
let word = if top { word.get(1..).unwrap() } else { word };
for cmd in command_tree.map(|x| x.get_subcommands()).unwrap_or(vec![]).into_iter() {
if cmd.starts_with(word) {
completions.push(Completion {
completion: format!("{}{}", if top { ":" } else { "" }, cmd),
display: Some(cmd.to_string()),
suffix: ::linefeed::complete::Suffix::Some(' ')
})
}
}
break;
},
Some(s) => {
let new_ptr: Option<&CommandTree> = command_tree.and_then(|cm| match cm {
CommandTree::Top(children) => children.iter().find(|c| c.get_cmd() == s),
CommandTree::NonTerminal { children, .. } => children.iter().find(|c| c.get_cmd() == s),
CommandTree::Terminal { children, .. } => children.iter().find(|c| c.get_cmd() == s),
});
command_tree = new_ptr;
}
}
}
Some(completions)
}
}

View File

@@ -1,47 +0,0 @@
use crate::language::DebugAsk;
use std::io::{Read, Write};
use std::collections::HashSet;
use std::fs::File;
#[derive(Serialize, Deserialize)]
pub struct ReplOptions {
pub debug_asks: HashSet<DebugAsk>,
pub show_total_time: bool,
pub show_stage_times: bool,
}
impl ReplOptions {
pub fn new() -> ReplOptions {
ReplOptions {
debug_asks: HashSet::new(),
show_total_time: true,
show_stage_times: false,
}
}
pub fn save_to_file(&self, filename: &str) {
let res = File::create(filename)
.and_then(|mut file| {
let buf = crate::serde_json::to_string(self).unwrap();
file.write_all(buf.as_bytes())
});
if let Err(err) = res {
println!("Error saving {} file {}", filename, err);
}
}
pub fn load_from_file(filename: &str) -> Result<ReplOptions, ()> {
File::open(filename)
.and_then(|mut file| {
let mut contents = String::new();
file.read_to_string(&mut contents)?;
Ok(contents)
})
.and_then(|contents| {
let output: ReplOptions = crate::serde_json::from_str(&contents)?;
Ok(output)
})
.map_err(|_| ())
}
}
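
Since the options file is just `serde_json` of `ReplOptions`, a fresh save comes out roughly as `{"debug_asks":[],"show_total_time":true,"show_stage_times":false}`. A round-trip sketch, with a made-up filename for the example:

#[test]
fn options_roundtrip_sketch() {
    let opts = ReplOptions::new();
    // Hypothetical file name; the real REPL uses its own save path.
    opts.save_to_file(".schala_repl_example");
    let loaded = ReplOptions::load_from_file(".schala_repl_example")
        .expect("the file just written should parse back");
    assert!(loaded.show_total_time);
    assert!(!loaded.show_stage_times);
    assert!(loaded.debug_asks.is_empty());
}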

View File

@@ -1,67 +0,0 @@
use colored::*;
use std::fmt;
use std::fmt::Write;
use super::ReplOptions;
use crate::language::{ DebugAsk, ComputationResponse};
pub struct ReplResponse {
label: Option<String>,
text: String,
color: Option<Color>
}
impl fmt::Display for ReplResponse {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
let mut buf = String::new();
if let Some(ref label) = self.label {
write!(buf, "({})", label).unwrap();
}
write!(buf, "=> {}", self.text).unwrap();
write!(f, "{}", match self.color {
Some(c) => buf.color(c),
None => buf.normal()
})
}
}
pub fn handle_computation_response(response: ComputationResponse, options: &ReplOptions) -> Vec<ReplResponse> {
let mut responses = vec![];
if options.show_total_time {
responses.push(ReplResponse {
label: Some("Total time".to_string()),
text: format!("{:?}", response.global_output_stats.total_duration),
color: None,
});
}
if options.show_stage_times {
responses.push(ReplResponse {
label: Some("Stage times".to_string()),
text: format!("{:?}", response.global_output_stats.stage_durations),
color: None,
});
}
for debug_resp in response.debug_responses {
let stage_name = match debug_resp.ask {
DebugAsk::ByStage { stage_name, .. } => stage_name,
_ => continue,
};
responses.push(ReplResponse {
label: Some(stage_name.to_string()),
text: debug_resp.value,
color: Some(Color::Red),
});
}
responses.push(match response.main_output {
Ok(s) => ReplResponse { label: None, text: s, color: None },
Err(e) => ReplResponse { label: Some("Error".to_string()), text: e, color: Some(Color::Red) },
});
responses
}

44
schala-repl/src/webapp.rs Normal file
View File

@@ -0,0 +1,44 @@
use rocket;
use rocket::State;
use rocket::response::Content;
use rocket::http::ContentType;
use rocket_contrib::Json;
use language::{ProgrammingLanguageInterface, EvalOptions};
use WEBFILES;
use ::PLIGenerator;
#[get("/")]
fn index() -> Content<String> {
let path = "static/index.html";
let html_contents = String::from_utf8(WEBFILES.get(path).unwrap().into_owned()).unwrap();
Content(ContentType::HTML, html_contents)
}
#[get("/bundle.js")]
fn js_bundle() -> Content<String> {
let path = "static/bundle.js";
let js_contents = String::from_utf8(WEBFILES.get(path).unwrap().into_owned()).unwrap();
Content(ContentType::JavaScript, js_contents)
}
#[derive(Debug, Serialize, Deserialize)]
struct Input {
source: String,
}
#[derive(Serialize, Deserialize)]
struct Output {
text: String,
}
#[post("/input", format = "application/json", data = "<input>")]
fn interpreter_input(input: Json<Input>, generators: State<Vec<PLIGenerator>>) -> Json<Output> {
let schala_gen = generators.get(0).unwrap();
let mut schala: Box<ProgrammingLanguageInterface> = schala_gen();
let code_output = schala.execute_pipeline(&input.source, &EvalOptions::default());
Json(Output { text: code_output.to_repl() })
}
pub fn web_main(language_generators: Vec<PLIGenerator>) {
rocket::ignite().manage(language_generators).mount("/", routes![index, js_bundle, interpreter_input]).launch();
}
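
For orientation, the wire format of `/input` is just these serde structs: a POST body shaped like `{"source": "1 + 1"}` comes back as `{"text": "..."}` carrying the interpreter's output. A sketch of the request side, assuming a `serde_json` dependency is available to the crate:

#[test]
fn input_wire_format_sketch() {
    // Hypothetical: serialize the same struct the endpoint deserializes.
    let body = ::serde_json::to_string(&Input { source: "1 + 1".to_string() }).unwrap();
    assert_eq!(body, r#"{"source":"1 + 1"}"#);
}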

View File

@@ -1,15 +1,20 @@
extern crate schala_repl;
//extern crate maaru_lang;
//extern crate rukka_lang;
//extern crate robo_lang;
extern crate maaru_lang;
extern crate rukka_lang;
extern crate robo_lang;
extern crate schala_lang;
use schala_repl::{ProgrammingLanguageInterface, start_repl};
use schala_repl::{PLIGenerator, repl_main};
extern { }
fn main() {
let langs: Vec<Box<dyn ProgrammingLanguageInterface>> = vec![Box::new(schala_lang::Schala::new())];
start_repl(langs);
let generators: Vec<PLIGenerator> = vec![
Box::new(|| { Box::new(schala_lang::Schala::new())}),
Box::new(|| { Box::new(maaru_lang::Maaru::new())}),
Box::new(|| { Box::new(robo_lang::Robo::new())}),
Box::new(|| { Box::new(rukka_lang::Rukka::new())}),
];
repl_main(generators);
}