Commit ab773490 authored by Jens Korinth's avatar Jens Korinth

Squashed commit of the following:

commit e1683d56027c71d4fbf2ba2543a528da1b44285d
Author: Jens Korinth <jkorinth@gmx.net>
Date:   Thu Jul 13 17:36:59 2017 +0200

    Replace old Feature implementation

    * old implementation was "fat", i.e., each Feature had to have its own
      class on the Scala side of things
    * knowledge about validity lives on the Tcl side; duplicating the checks
      for invalid values on the Scala side meant double effort, and the two
      went out of sync
    * removed fat layer, replaced by thin, generic implementation:
      Features have a name and a map of properties, mapping strings to
      strings; all properties are passed exactly as-is to Tcl
    * special case: added "Enabled" -> "true" in all cases, reasoning: if
      somebody goes to all the trouble of defining a Feature, it should
      usually be enabled
    * can be overridden explicitly (for whatever reason)
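The thin representation described above can be sketched in a few lines: a Feature is just a name plus a string-to-string property map, with "Enabled" -> "true" injected unless the user set it explicitly (names follow the replacement in base/Feature.scala).

```scala
// Thin, generic Feature: a name plus string-to-string properties that are
// passed to Tcl exactly as-is; "Enabled" -> "true" is injected unless the
// user already set it, so it can still be overridden explicitly.
class Feature(val name: String, val props: Map[String, String])

object Feature {
  def apply(name: String, props: Map[String, String]): Feature = new Feature(
    name,
    if (props.get("Enabled").nonEmpty) props else props + ("Enabled" -> "true")
  )
}
```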

commit b559df40a0ad697e4d1b60f8c4ee927cae44c4ec
Author: Jens Korinth <jkorinth@gmx.net>
Date:   Thu Jul 13 16:45:15 2017 +0200

    Fix bug in path parser, fix test cases

commit 0ee9f090705d1ffd4b36d5c26b9a1978c57fab6a
Author: Jens Korinth <jkorinth@gmx.net>
Date:   Thu Jul 13 15:58:54 2017 +0200

    Fix bug in HLS gens

    * 'all' must be used _instead_ of kernel list, not within

commit 7b67ee180af8f53276b58c9697b8f798469b2afc
Author: Jens Korinth <jkorinth@gmx.net>
Date:   Thu Jul 13 15:51:39 2017 +0200

    Update usage info and man pages

    * man page is generated by Usage; not as nicely formatted, but readable
    * simplifies maintenance by keeping information in one place only
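The single-source-of-truth idea can be sketched as follows; `Opt`, `usageText` and `manOptions` are invented names for illustration, not the actual Usage API. Both renderings are derived from the same option table, so they cannot drift apart.

```scala
// Hypothetical sketch: one option table drives both the usage screen and
// the troff man-page text, keeping the two in sync by construction.
case class Opt(name: String, desc: String)

// plain-text usage listing
def usageText(opts: Seq[Opt]): String =
  opts map (o => f"  --${o.name}%-12s ${o.desc}") mkString "\n"

// troff OPTIONS section for the man page
def manOptions(opts: Seq[Opt]): String =
  ".SH OPTIONS\n" + (opts map (o => s".TP\n\\fB--${o.name}\\fR\n${o.desc}") mkString "\n")
```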

commit 7cffc777da2db55f74e23eb1546f597a881de9df
Author: Jens Korinth <jkorinth@gmx.net>
Date:   Thu Jul 13 12:11:50 2017 +0200

    Implement missing --logFile option

    * added to property checks as well

commit bfa6d8fc173c4edd58c78c12f749ebfc2947ee4e
Author: Jens Korinth <jkorinth@gmx.net>
Date:   Thu Jul 13 12:01:08 2017 +0200

    Increase the number of worker threads to 500

commit 884d3f5ffd5d3fd0890f07650cabc4a482ad4f1b
Author: Jens Korinth <jkorinth@gmx.net>
Date:   Thu Jul 13 11:59:30 2017 +0200

    Improve error messages from parser

    * mostly fine-tuning to make sure the lastParser value gives the user
      the right idea about the mistake
    * see also the script parserTest.sh in Seafile/TAPASCO to generate a lot
      of error messages automatically

commit 28215f874dab107810344584b98267809eb4dc4d
Author: Jens Korinth <jkorinth@gmx.net>
Date:   Wed Jul 12 19:27:53 2017 +0200

    Dump example jobs, if none are specified

    * Configuration dumping should automatically use JobExamples to generate
      a list with one instance of each job
    * useful for users to get a starting point for their configs
    * updated README.md accordingly
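The behaviour amounts to a one-line substitution before dumping; this sketch uses a simplified `Config` and plain strings as stand-ins for the real Tapasco configuration and job types.

```scala
// Simplified sketch: when no jobs were specified, dump one example
// instance of each job kind (as JobExamples provides in the real code)
// so users get a starting point for their configs.
case class Config(jobs: Seq[String])

// one example per job kind
val jobExamples = Seq("Compose", "CoreStatistics", "DSE", "HLS", "Import")

def configToDump(cfg: Config): Config =
  if (cfg.jobs.isEmpty) cfg.copy(jobs = jobExamples) else cfg
```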

commit 72969fcbcb6ff5583c368969fa6d59cc5d290756
Author: Jens Korinth <jkorinth@gmx.net>
Date:   Wed Jul 12 19:22:01 2017 +0200

    Finish work on new parser and property tests

    * all parsers implemented, spec'ed and debugged
    * wrote a lot of property tests, caught some bugs I'd never have found
      otherwise, very nice
    * changed a few bits and pieces, need to rewrite man and usage

commit 786596a793a8d0510465438e77a5067f5d1bd67b
Author: Jens Korinth <jkorinth@gmx.net>
Date:   Tue Jul 11 18:49:36 2017 +0200

    Fix exceptions in MemInfo when /proc/meminfo is unavailable

    * not portable: /proc/meminfo may not exist (e.g., on Darwin)
    * warn once, then deactivate all memory resource checks
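A sketch of that fallback, with parsing separated from I/O so it stays testable; the object and method names here are hypothetical, not the actual MemInfo API.

```scala
import scala.io.Source
import scala.util.Try

// Hedged sketch: read MemTotal from /proc/meminfo; on any failure, warn
// once and report None so callers skip memory resource checks.
object MemInfo {
  @volatile private var warned = false

  // pure parsing step, independent of the filesystem
  def parseMemTotalKiB(lines: Iterator[String]): Option[Long] =
    lines collectFirst {
      case l if l startsWith "MemTotal:" => l.split("\\s+")(1).toLong
    }

  def memTotalKiB: Option[Long] =
    Try(parseMemTotalKiB(Source.fromFile("/proc/meminfo").getLines())).toOption.flatten orElse {
      if (!warned) {
        Console.err.println("[MemInfo] /proc/meminfo unavailable, disabling memory checks")
        warned = true
      }
      None
    }
}
```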

commit e029dc27619de33477e3a3c156a91e18bb1f7eb7
Author: Jens Korinth <jkorinth@gmx.net>
Date:   Tue Jul 11 18:49:08 2017 +0200

    Fix setup.sh on Darwin

commit b53bbbc225eaf0b1c31bac2e20299e07b9e00fcf
Author: Jens Korinth <jkorinth@gmx.net>
Date:   Tue Jul 11 18:47:42 2017 +0200

    Continue work on new parser

    * split classes into more sub-objects
    * each sub-object needs its own test suite with at least one test for
      each parser defined therein
    * got better with ScalaCheck; the tests are stronger

commit 6b3b2504d31e1f673f4336901bec18203e3f2b3d
Author: Jens Korinth <jkorinth@gmx.net>
Date:   Sun Jul 9 18:34:01 2017 +0200

    Add unit tests and clean up a little

commit 8b12879df3ed59803a66ee7a97d006bb20982056
Author: Jens Korinth <jkorinth@gmx.net>
Date:   Sun Jul 9 18:33:00 2017 +0200

    Add missing --parallel in man page

commit 10d8b3a02bb8b02fc4ac287aa9e6fa6d4dece51e
Author: Jens Korinth <jkorinth@gmx.net>
Date:   Sat Jul 8 23:34:22 2017 +0200

    Implement new parser with better error handling

    * error messages of the current command line argument parser are
      abysmal, extremely confusing to users
    * since scala parser combinators do not feature cuts, it is very
      difficult to provide better error messages
    * started re-implementation of parser using 'fastparse'
    * works extremely well so far
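The "cut" the message refers to can be shown with a toy model (this is not the fastparse API, just an illustration of the idea): once a distinctive prefix has matched, the parser commits to that branch, so failure reports the expected continuation instead of falling through every alternative.

```scala
// Toy model of a parser "cut". A committed Fail is not masked by trying
// later alternatives, which is what makes error messages precise.
sealed trait Result
case class Ok(rest: String) extends Result
case class Fail(expected: String, committed: Boolean) extends Result

type P = String => Result

def lit(s: String): P =
  in => if (in startsWith s) Ok(in drop s.length) else Fail(s"'$s'", committed = false)

// p followed by q, with a cut after p (analogous to fastparse's ~/):
// once p has matched, a failure of q is committed.
def cutSeq(p: P, q: P): P = in => p(in) match {
  case Ok(rest) => q(rest) match {
    case Fail(e, _) => Fail(e, committed = true)
    case ok         => ok
  }
  case fail => fail
}

// p | q: q is tried only if p failed without committing.
def alt(p: P, q: P): P = in => p(in) match {
  case Fail(_, false) => q(in)
  case other          => other
}

val logFile = cutSeq(lit("--log"), lit("File"))
val flags   = alt(logFile, lit("--help"))
```

With this, `flags("--logFoo")` fails with `'File'` expected (committed), pointing at the actual mistake rather than complaining that `--help` did not match.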
parent d47c6666
......@@ -41,8 +41,12 @@ Basic Setup
You need to do this every time you use TPC (or put it into your `~/.bashrc`).
2. Build TPC: `sbt compile` (this may take a while, `sbt` needs to fetch all
dependencies etc. once).
2. Create the necessary jar files with 'sbt assembly'.
2. Create the necessary jar files with `sbt assembly`.
4. Run TPC unit tests: `sbt test`
5. _Optional_: Generate sample configuration file: `tapasco -n config.json`
TaPaSCo should exit immediately and `config.json` will include a full
configuration that can be read with `--configFile`, including one example
for each kind of job.
When the tests complete successfully, TPC is ready to use.
When everything completed successfully, TaPaSCo is ready to use!
Read on in [Getting Started](GETTINGSTARTED.md).
......@@ -23,7 +23,8 @@ libraryDependencies ++= Seq(
"com.google.guava" % "guava" % "19.0",
"com.google.code.findbugs" % "jsr305" % "3.0.1",
"org.scalatest" %% "scalatest" % "3.0.3" % "test",
"org.scalacheck" %% "scalacheck" % "1.13.5" % "test"
"org.scalacheck" %% "scalacheck" % "1.13.5" % "test",
"com.lihaoyi" %% "fastparse" % "0.4.3"
)
scalacOptions ++= Seq(
......
......@@ -11,9 +11,11 @@
"Features" : [
{
"Feature": "Cache",
"Enabled" : true,
"Size" : 32768,
"Associativity": 2
"Properties": {
"Enabled" : "true",
"Size" : "32768",
"Associativity": "2"
}
}
]
}
......
......@@ -11,11 +11,13 @@
"Features" : [
{
"Feature": "Debug",
"Enabled" : true,
"Depth" : 4096,
"Stages" : 1,
"Use Defaults" : false,
"Nets" : ["*interrupt", "*HP0*", "*GP*"]
"Properties": {
"Enabled" : "true",
"Depth" : "4096",
"Stages" : "1",
"Use Defaults" : "false",
"Nets" : "[list *interrupt *HP0* *GP]"
}
}
]
}
......
......@@ -8,7 +8,7 @@
"Composition" : {
"Composition":[{"Kernel": "Kern1", "Count": 1}]
},
"Features" : [ { "Feature": "Debug", "Enabled" : false } ]
"Features" : [ { "Feature": "Debug", "Properties": { "Enabled" : "false" } } ]
}
]
}
{
"Jobs": [
{
"Job": "Compose",
"Design Frequency": 100,
"Platforms" : ["Plat1", "Plat2"],
"Architectures" : ["Arch1", "Arch3"],
"Composition" : {
"Composition":[{"Kernel": "Kern1", "Count": 1}]
},
"Features" : [
{
"Feature": "Cache",
"Enabled" : true,
"Size" : 262144,
"Associativity": 8
}
]
}
]
}
{
"Jobs": [
{
"Job": "Compose",
"Design Frequency": 100,
"Platforms" : ["Plat1", "Plat2"],
"Architectures" : ["Arch1", "Arch3"],
"Composition" : {
"Composition":[{"Kernel": "Kern1", "Count": 1}]
},
"Features" : [
{
"Feature": "Cache",
"Enabled" : true,
"Size" : 1048576,
"Associativity": 2
}
]
}
]
}
{
"Jobs": [
{
"Job": "Compose",
"Design Frequency": 100,
"Platforms" : ["Plat1", "Plat2"],
"Architectures" : ["Arch1", "Arch3"],
"Composition" : {
"Composition":[{"Kernel": "Kern1", "Count": 1}]
},
"Features" : [
{
"Feature": "Debug",
"Enabled" : true,
"Depth" : 3,
"Stages" : 1
}
]
}
]
}
{
"Jobs": [
{
"Job": "Compose",
"Design Frequency": 100,
"Platforms" : ["Plat1", "Plat2"],
"Architectures" : ["Arch1", "Arch3"],
"Composition" : {
"Composition":[{"Kernel": "Kern1", "Count": 1}]
},
"Features" : [
{
"Feature": "Debug",
"Enabled" : true,
"Depth" : 4096,
"Stages" : 1000
}
]
}
]
}
......@@ -11,7 +11,7 @@
"Features" : [
{
"Feature": "LED",
"Enabled": true
"Properties": { "Enabled": "true" }
}
]
}
......
......@@ -11,7 +11,7 @@
"Features" : [
{
"Feature": "OLED",
"Enabled": true
"Properties": { "Enabled": "true" }
}
]
}
......
.TH itapasco 1 "May 11, 2017" "version 2017.1" "USER COMMANDS"
.TH itapasco 1 "July 13, 2017" "version 2017.1" "USER COMMANDS"
.SH NAME
itapasco \- The Interactive Task Parallel System Composer
.SH SYNOPSIS
.B itapasco
[global options]* [jobs]*
[global option]* [job]*
.SH DESCRIPTION
A Swing-based GUI for tapasco. See tapasco(1) for details on the options.
.SH ENVIRONMENT
......
if [ -n "$BASH_VERSION" ]; then
export TAPASCO_HOME=`dirname ${BASH_SOURCE[0]} | xargs realpath`
if [ "`uname`" = "Darwin" ]; then
export TAPASCO_HOME=`dirname ${BASH_SOURCE[0]} | xargs cd | pwd`
else
export TAPASCO_HOME=`dirname ${BASH_SOURCE[0]} | xargs realpath`
fi
elif [ -n "$ZSH_VERSION" ]; then
export TAPASCO_HOME=`dirname ${(%):-%x} | xargs realpath`
else
......
......@@ -25,7 +25,6 @@ import parser._
import slurm._
import java.nio.file.Path
import scala.concurrent._
//import scala.concurrent.ExecutionContext.Implicits.global
object Tapasco {
import org.slf4j.LoggerFactory
......@@ -58,19 +57,25 @@ object Tapasco {
(firstArg.toLowerCase equals "itapasco") && { new AppController(Some(cfg)).show; true }
} getOrElse false
private def dryRun(p: Path)(implicit cfg: Configuration) {
import base.json._
logger.info("dry run, dumping configuration to {}", p)
Configuration.to(if (cfg.jobs.isEmpty) cfg.jobs(jobs.JobExamples.jobs) else cfg, p)
System.exit(0)
}
def main(args: Array[String]) {
implicit val tasks = new Tasks
val ok = try {
// try to parse all arguments
val c = CommandLineParser(args mkString " ") match {
// if that fails, check if special command was given as first parameter
case Left(ex) => CommandLineParser(args.tail mkString " ")
case r => r
}
val c = CommandLineParser(args mkString " ")
logger.debug("parsed config: {}", c)
if (c.isRight) {
// get parsed Configuration
implicit val cfg = c.right.get
// dump config and exit, if dryRun is selected
cfg.dryRun foreach (dryRun _)
// else continue ...
logger.trace("configuring FileAssetManager...")
FileAssetManager(cfg)
logger.trace("SLURM: {}", cfg.slurm)
......@@ -79,7 +84,7 @@ object Tapasco {
logger.trace("parallel: {}", cfg.parallel)
cfg.logFile map { logfile: Path => setupLogFileAppender(logfile.toString) }
logger.info("Running with configuration: {}", cfg.toString)
implicit val exe = ExecutionContext.fromExecutor(new java.util.concurrent.ForkJoinPool(250))
implicit val exe = ExecutionContext.fromExecutor(new java.util.concurrent.ForkJoinPool(500))
def get(f: Future[Boolean]): Boolean = { Await.ready(f, duration.Duration.Inf); f.value map (_ getOrElse false) getOrElse false }
if (cfg.parallel)
runGui(args) || (cfg.jobs map { j => Future { jobs.executors.execute(j) } } map (get _) fold true) (_ && _)
......@@ -87,7 +92,7 @@ object Tapasco {
runGui(args) || (cfg.jobs map { jobs.executors.execute(_) } fold true) (_ && _)
} else {
logger.error("invalid arguments: {}", c.left.get.toString)
logger.error(Usage())
logger.error("run `tapasco -h` or `tapasco --help` to get more info")
false
}
} catch { case ex: Exception =>
......
......@@ -30,7 +30,6 @@ import de.tu_darmstadt.cs.esa.tapasco.base._
import de.tu_darmstadt.cs.esa.tapasco.base.json._
import de.tu_darmstadt.cs.esa.tapasco.util._
import de.tu_darmstadt.cs.esa.tapasco.filemgmt.FileAssetManager
import scala.sys.process._
import java.nio.file._
/**
......
......@@ -50,6 +50,8 @@ trait Configuration {
def parallel(enabled: Boolean): Configuration
def maxThreads: Option[Int]
def maxThreads(mt: Option[Int]): Configuration
def dryRun(cfg: Option[Path]): Configuration
def dryRun: Option[Path]
/** Returns the default output directory for the given kernel and target. */
def outputDir(kernel: Kernel, target: Target): Path =
......@@ -64,7 +66,7 @@ trait Configuration {
.resolve(target.pd.name)
.resolve(composition.composition map (_.kernel.replaceAll(" ", "_")) mkString "__")
.resolve(composition.composition map (ce => "%03d".format(ce.count)) mkString "_")
.resolve("%05.1f%s".format(freq, (features filter (_.enabled) map ("+" + _.shortName)).sorted mkString ""))
.resolve("%05.1f%s".format(freq, (features map ("+" + _.name)).sorted mkString ""))
/** Returns the default output directory for the given core and target. */
def outputDir(core: Core, target: Target): Path =
......
......@@ -44,6 +44,7 @@ private case class ConfigurationImpl (
slurm: Boolean = false,
parallel: Boolean = false,
maxThreads: Option[Int] = None,
dryRun: Option[Path] = None,
jobs: Seq[Job] = Seq()
) extends Description(descPath: Path) with Configuration {
def descPath(p: Path): Configuration = this.copy(descPath = p)
......@@ -62,6 +63,7 @@ private case class ConfigurationImpl (
def slurm(enabled: Boolean): Configuration = this.copy(slurm = enabled)
def parallel(enabled: Boolean): Configuration = this.copy(parallel = enabled)
def maxThreads(mt: Option[Int]): Configuration = this.copy(maxThreads = mt)
def dryRun(cfg: Option[Path]): Configuration = this.copy(dryRun = cfg)
def jobs(js: Seq[Job]): Configuration = this.copy(jobs = js)
// these directories must exist
......
......@@ -23,49 +23,19 @@
**/
package de.tu_darmstadt.cs.esa.tapasco.base
sealed abstract class Feature(val enabled: Boolean) {
def shortName: String = this.getClass.getSimpleName
sealed class Feature(val name: String, val props: Map[String, String]) {
def unapply: Option[(String, Map[String, String])] = Some((name, props))
override def equals(o: Any): Boolean = o match {
case Feature(n, p) => name.equals(n) && props.equals(p)
case _ => false
}
}
// scalastyle:off magic.number
object Feature {
final case class LED(override val enabled: Boolean) extends Feature(enabled)
final case class OLED(override val enabled: Boolean) extends Feature(enabled)
final case class Cache(override val enabled: Boolean, size: Int, associativity: Int) extends Feature(enabled) {
private val logger = de.tu_darmstadt.cs.esa.tapasco.Logging.logger(getClass)
private def cacheSizeSupported(n: Int): Boolean = {
val supportedSizes = List(32768, 65536, 131072, 262144, 524288)
val ok = supportedSizes.contains(n)
if (! ok) {
logger.warn("Cache size " + n + " is not supported, " +
"ignoring cache configuration. Supported sizes: " + supportedSizes)
}
ok
}
override def shortName: String = "Cache_%d,%d".format(size, associativity)
require (cacheSizeSupported(size), "cache size %d is not supported".format(size))
}
final case class Debug(override val enabled: Boolean, depth: Option[Int], stages: Option[Int],
useDefaults: Option[Boolean], nets: Option[Seq[String]]) extends Feature(enabled) {
private val logger = de.tu_darmstadt.cs.esa.tapasco.Logging.logger(getClass)
private def dataDepthSupported(n: Int): Boolean = {
val supportedDepths = List(1024, 2048, 4096, 8192, 16384)
val ok = supportedDepths.contains(n)
if (! ok) {
logger.warn("Debug core data depth " + n + " is not supported, " +
"ignoring debug configuration. Supported sizes: " + supportedDepths)
}
ok
}
private def stagesSupported(n: Int): Boolean = n >= 0 && n <= 6
def apply(name: String, props: Map[String, String]): Feature = new Feature(
name,
if (props.get("Enabled").nonEmpty) props else props + ("Enabled" -> "true")
)
depth foreach { d => require(dataDepthSupported(d), "data depth %d not supported".format(d)) }
stages foreach { s => require(stagesSupported(s), "%d stages not supported".format(s)) }
}
final case class BlueDma(override val enabled: Boolean) extends Feature(enabled)
final case class AtsPri(override val enabled: Boolean) extends Feature(enabled)
def unapply(f: Feature): Option[(String, Map[String, String])] = f.unapply
}
// scalastyle:on magic.number
package de.tu_darmstadt.cs.esa.tapasco.base
import de.tu_darmstadt.cs.esa.tapasco.Implicits._
import de.tu_darmstadt.cs.esa.tapasco.json._
import de.tu_darmstadt.cs.esa.tapasco.jobs._
import de.tu_darmstadt.cs.esa.tapasco.jobs.json._
......@@ -169,86 +168,15 @@ package object json {
/* Core @} */
/* @{ Features */
private val readsLEDFeature: Reads[Feature] = (
(JsPath \ "Feature").read[String] (verifying[String](_ equals "LED")) ~>
(JsPath \ "Enabled").readNullable[Boolean].map (_ getOrElse true)
) .fmap(Feature.LED.apply _)
private val writesLEDFeature: Writes[Feature.LED] = (
(JsPath \ "Feature").write[String] ~
(JsPath \ "Enabled").write[Boolean]
) (unlift(Feature.LED.unapply _ andThen (_ map (("LED", _)))))
private val readsOLEDFeature: Reads[Feature] = (
(JsPath \ "Feature").read[String] (verifying[String](_ equals "OLED")) ~>
(JsPath \ "Enabled").readNullable[Boolean].map (_ getOrElse true)
) .fmap(Feature.OLED.apply _)
private val writesOLEDFeature: Writes[Feature.OLED] = (
(JsPath \ "Feature").write[String] ~
(JsPath \ "Enabled").write[Boolean]
) (unlift(Feature.OLED.unapply _ andThen (_ map (("OLED", _)))))
private val readsCacheFeature: Reads[Feature] = (
(JsPath \ "Feature").read[String] (verifying[String](_ equals "Cache")) ~>
(JsPath \ "Enabled").readNullable[Boolean].map (_ getOrElse true) ~
(JsPath \ "Size").read[Int] ~
(JsPath \ "Associativity").read[Int] (verifying[Int](n => n == 2 || n == 4))
) (Feature.Cache.apply _)
private implicit val writesCacheFeature: Writes[Feature.Cache] = (
(JsPath \ "Feature").write[String] ~
(JsPath \ "Enabled").write[Boolean] ~
(JsPath \ "Size").write[Int] ~
(JsPath \ "Associativity").write[Int]
) (unlift(Feature.Cache.unapply _ andThen (_ map ("Cache" +: _))))
private val readsDebugFeature: Reads[Feature] = (
(JsPath \ "Feature").read[String] (verifying[String](_ equals "Debug")) ~>
(JsPath \ "Enabled").readNullable[Boolean].map (_ getOrElse true) ~
(JsPath \ "Depth").readNullable[Int] ~
(JsPath \ "Stages").readNullable[Int] ~
(JsPath \ "Use Defaults").readNullable[Boolean] ~
(JsPath \ "Nets").readNullable[Seq[String]]
) (Feature.Debug.apply _)
private implicit val writesDebugFeature: Writes[Feature.Debug] = (
(JsPath \ "Feature").write[String] ~
(JsPath \ "Enabled").write[Boolean] ~
(JsPath \ "Depth").writeNullable[Int] ~
(JsPath \ "Stages").writeNullable[Int] ~
(JsPath \ "Use Defaults").writeNullable[Boolean] ~
(JsPath \ "Nets").writeNullable[Seq[String]]
) (unlift(Feature.Debug.unapply _ andThen (_ map ("Debug" +: _))))
implicit val readsFeature: Reads[Feature] = (
(JsPath \ "Feature").read[String] ~
(JsPath \ "Properties").read[Map[String, String]]
) (Feature.apply _)
private val readsBlueDmaFeature: Reads[Feature] = (
(JsPath \ "Feature").read[String] (verifying[String](_ equals "BlueDMA")) ~>
(JsPath \ "Enabled").readNullable[Boolean].map (_ getOrElse true)
) .fmap(Feature.BlueDma.apply _)
private val writesBlueDmaFeature: Writes[Feature.BlueDma] = (
implicit val writesFeature: Writes[Feature] = (
(JsPath \ "Feature").write[String] ~
(JsPath \ "Enabled").write[Boolean]
) (unlift(Feature.BlueDma.unapply _ andThen (_ map (("BlueDMA", _)))))
private val readsAtsPriFeature: Reads[Feature] = (
(JsPath \ "Feature").read[String] (verifying[String](_ equals "ATS+PRI")) ~>
(JsPath \ "Enabled").readNullable[Boolean].map (_ getOrElse true)
) .fmap(Feature.AtsPri.apply _)
private val writesAtsPriFeature: Writes[Feature.AtsPri] = (
(JsPath \ "Feature").write[String] ~
(JsPath \ "Enabled").write[Boolean]
) (unlift(Feature.AtsPri.unapply _ andThen (_ map (("ATS+PRI", _)))))
implicit val readsFeature: Reads[Feature] =
readsLEDFeature | readsOLEDFeature | readsCacheFeature | readsDebugFeature |
readsBlueDmaFeature | readsAtsPriFeature
implicit object writesFeature extends Writes[Feature] {
def writes(f: Feature): JsValue = f match {
case f: Feature.LED => writesLEDFeature.writes(f)
case f: Feature.OLED => writesOLEDFeature.writes(f)
case f: Feature.Cache => writesCacheFeature.writes(f)
case f: Feature.Debug => writesDebugFeature.writes(f)
case f: Feature.BlueDma => writesBlueDmaFeature.writes(f)
case f: Feature.AtsPri => writesAtsPriFeature.writes(f)
}
}
(JsPath \ "Properties").write[Map[String, String]]
) (unlift(Feature.unapply _))
/* Features @} */
/* @{ Kernel.Argument */
......@@ -347,6 +275,7 @@ package object json {
(JsPath \ "Slurm").readNullable[Boolean].map (_ getOrElse false) ~
(JsPath \ "Parallel").readNullable[Boolean].map (_ getOrElse false) ~
(JsPath \ "MaxThreads").readNullable[Int] ~
(JsPath \ "DryRun").readNullable[Path] ~
(JsPath \ "Jobs").read[Seq[Job]]
) (ConfigurationImpl.apply _)
implicit private val configurationWrites: Writes[ConfigurationImpl] = (
......@@ -360,6 +289,7 @@ package object json {
(JsPath \ "Slurm").write[Boolean] ~
(JsPath \ "Parallel").write[Boolean] ~
(JsPath \ "MaxThreads").writeNullable[Int] ~
(JsPath \ "DryRun").writeNullable[Path].transform((js: JsObject) => js - "DryRun") ~
(JsPath \ "Jobs").write[Seq[Job]]
) (unlift(ConfigurationImpl.unapply _))
implicit object ConfigurationWrites extends Writes[Configuration] {
......
......@@ -23,36 +23,19 @@
**/
package de.tu_darmstadt.cs.esa.tapasco.base.tcl
import de.tu_darmstadt.cs.esa.tapasco.base.Feature
import de.tu_darmstadt.cs.esa.tapasco.base.Feature._
import scala.util.Properties.{lineSeparator => NL}
class FeatureTclPrinter(prefix: String) {
private val pre = "dict set " + prefix + "features "
private val pre = s"dict set ${prefix}features"
/** Generate Tcl commands to add a feature to a Tcl dict.
* @param f Feature to add.
* @return String containing Tcl commands to write f into
* a dict called <prefix>features.
**/
def toTcl(f: Feature): String = f match {
case LED(enabled) => pre + "LED enabled " + enabled
case OLED(enabled) => pre + "OLED enabled " + enabled
case Cache(enabled, size, associativity) => Seq(
pre + "Cache enabled " + enabled,
pre + "Cache size " + size,
pre + "Cache associativity " + associativity).mkString(NL)
case Debug(enabled, depth, stages, useDefaults, nets) => Seq(
pre + "Debug enabled " + enabled,
if (depth.isEmpty) "" else pre + "Debug depth " + depth.get,
if (stages.isEmpty) "" else pre + "Debug stages " + stages.get,
if (useDefaults.isEmpty) "" else pre + "Debug use_defaults " + useDefaults.get,
if (nets.isEmpty) "" else pre + "Debug nets [list " + nets.get.map(n => "{" + n + "}").mkString(" ") + "] "
).mkString(NL)
case BlueDma(enabled) => pre + "BlueDMA enabled " + enabled
case AtsPri(enabled) => pre + "ATS-PRI enabled " + enabled
case _ => "unknown feature"
}
def toTcl(f: Feature): String = f.props map {
case (name, value) => s"$pre ${f.name} $name $value"
} mkString NL
def toTcl(fs: Seq[Feature]): String = fs.map(toTcl).mkString(NL)
}
......@@ -31,6 +31,11 @@ object Heuristics {
type Value = Double
abstract class Heuristic extends Function3[Composition, Frequency, Target, Configuration => Value]
def apply(name: String): Heuristic = name.toLowerCase match {
case "throughput" | "job throughput" => ThroughputHeuristic
case o => throw new Exception(s"unknown heuristic: '$o'")
}
object ThroughputHeuristic extends Heuristic {
private def findAverageClockCycles(kernel: String, target: Target)
(implicit cfg: Configuration): Int = {
......
......@@ -46,8 +46,8 @@ final case class ComposeJob(
composition: Composition,
designFrequency: Heuristics.Frequency,
private val _implementation: String,
private val _architectures: Option[Seq[String]],
private val _platforms: Option[Seq[String]],
private val _architectures: Option[Seq[String]] = None,
private val _platforms: Option[Seq[String]] = None,
features: Option[Seq[Feature]] = None,
debugMode: Option[String] = None) extends Job("compose") {
/** Returns the selected composer tool implementation. */
......@@ -77,9 +77,9 @@ final case class ComposeJob(
* @param _platforms Name list of [[base.Platform]] instances.
**/
final case class CoreStatisticsJob(
prefix: Option[String],
private val _architectures: Option[Seq[String]],
private val _platforms: Option[Seq[String]]) extends Job("corestats") {
prefix: Option[String] = None,
private val _architectures: Option[Seq[String]] = None,
private val _platforms: Option[Seq[String]] = None) extends Job("corestats") {
/** Returns the list of [[base.Architecture]] instances selected in this job. */
def architectures: Set[Architecture] =
FileAssetManager.entities.architectures filter (a => _architectures map (_.contains(a.name)) getOrElse true)
......@@ -121,10 +121,10 @@ final case class DesignSpaceExplorationJob(
dimensions: DesignSpace.Dimensions,
heuristic: Heuristics.Heuristic,
batchSize: Int,
basePath: Option[Path],
private val _architectures: Option[Seq[String]],
private val _platforms: Option[Seq[String]],
features: Option[Seq[Feature]],
basePath: Option[Path] = None,
private val _architectures: Option[Seq[String]] = None,
private val _platforms: Option[Seq[String]] = None,
features: Option[Seq[Feature]] = None,
debugMode: Option[String] = None) extends Job("dse") {
/** Returns the list of [[base.Architecture]] instances selected in this job. */
def architectures: Set[Architecture] =
......@@ -163,9 +163,9 @@ final case class DesignSpaceExplorationJob(
**/
final case class HighLevelSynthesisJob(
private val _implementation: String,
private val _architectures: Option[Seq[String]],
private val _platforms: Option[Seq[String]],
private val _kernels: Option[Seq[String]]) extends Job("hls") {
private val _architectures: Option[Seq[String]] = None,
private val _platforms: Option[Seq[String]] = None,
private val _kernels: Option[Seq[String]] = None) extends Job("hls") {
/** Returns the selected HLS tool implementation. */
lazy val implementation: HighLevelSynthesizer.Implementation = HighLevelSynthesizer.Implementation(_implementation)
......@@ -205,10 +205,10 @@ final case class HighLevelSynthesisJob(
final case class ImportJob(
zipFile: Path,
id: Kernel.Id,
description: Option[String],
averageClockCycles: Option[Int],
private val _architectures: Option[Seq[String]],
private val _platforms: Option[Seq[String]]) extends Job("import") {
description: Option[String] = None,
averageClockCycles: Option[Int] = None,
private val _architectures: Option[Seq[String]] = None,
private val _platforms: Option[Seq[String]] = None) extends Job("import") {
/** Returns the list of [[base.Architecture]] instances selected in this job. */
def architectures: Set[Architecture] =
FileAssetManager.entities.architectures filter (a => _architectures map (_.contains(a.name)) getOrElse true)
......
package de.tu_darmstadt.cs.esa.tapasco.parser
import de.tu_darmstadt.cs.esa.tapasco.dse.Heuristics.Frequency
import fastparse.all._
import java.nio.file._
import scala.language.implicitConversions
private object BasicParsers {
def longOption(name: String): Parser[String] = longOption(name, name)
def longOption(name: String, retVal: String, alternatives: String*) =
(name +: alternatives) map (n => IgnoreCase("--%s".format(n)).!.map(_ => retVal)) reduce (_|_)
def longShortOption(shortName: String, longName: String, retVal: Option[String] = None) =
IgnoreCase("-%s".format(shortName)).! | IgnoreCase("--%s".format(longName)).! map (retVal getOrElse _)
val argChars = "-"
val quoteChars = "\"'"
val seqSepChars = ";,:"
val whitespaceChars = " \n\t"
val specialChars = whitespaceChars ++ quoteChars ++ seqSepChars ++ argChars
val digitChars = '0' to '9'
val alphaChars = ('a' to 'z') ++ ('A' to 'Z')
val nonStringChars = whitespaceChars ++ quoteChars ++ seqSepChars
val ws = NoTrace(CharIn(whitespaceChars).rep.opaque("whitespace"))
val ws1 = NoTrace(CharIn(whitespaceChars).rep(1).opaque("whitespace"))
val seqSep = CharIn(seqSepChars)
val sep = ws ~ seqSep.opaque(s"list separator, one of $seqSepChars") ~ ws
val quote = CharIn(quoteChars).opaque(s"quote char, one of $quoteChars")
def string(exceptionChars: String): Parser[String] =
(CharPred(!(exceptionChars ++ nonStringChars).contains(_)).rep(1).!)
.opaque(s"string containing none of '$exceptionChars'")
def string(exceptionStrings: Seq[String]): Parser[String] =
// compute exception chars as first char in each exception string
string(exceptionStrings filter (_.nonEmpty) map (_.apply(0)) mkString)
.filter (s => (exceptionStrings map (!s.contains(_)) fold true) (_ && _))
.opaque(s"string containing none of $exceptionStrings")
val string: Parser[String] =
(CharIn(alphaChars).! ~ CharPred(!nonStringChars.contains(_)).rep.!)
.opaque("unquoted string")
.map { case (s, ss) => s ++ ss }
val quotedString: Parser[String] =
(quote ~/ CharPred(!quoteChars.contains(_)).rep.! ~ quote)