TT

package
v0.0.0-...-c046b01
Published: Apr 2, 2024 License: Apache-2.0 Imports: 15 Imported by: 0

README ¶

The Trust Library API

This library has been developed as a proof of concept concerning the use of trust in online applications. Unlike most discussions of trust, this is not a heuristic cryptotoken or credential scheme. Rather, it develops the idea that trust is a running assessment of a relationship between observer and observed in a system.

The code presented here is free to use, but it should be thought of as a proof of concept, since much of it is tailored to the specific use cases for which data could be obtained. The code has been purposely designed with few entry points.

The purpose of any machine learning scheme is to condense inputs to provide a dimensional reduction of the original data.

Data types and SST context

The library makes use of the Semantic Spacetime model context, which has supporting infrastructure currently based on ArangoDB.

A Go application using the library can open an Analytics context as follows:

    TT.InitializeSmartSpaceTime()

    var dbname string = "SST-ML"
    var dburl string = "http://localhost:8529" // ArangoDB port
    var user string = "root"
    var pwd string = "mark"

    G := TT.OpenAnalytics(dbname, dburl, user, pwd)

Transaction wrappers

Two sets of functions wrap transactional events or critical sections parenthetically (with begin-end semantics):

  • Functions that timestamp transactions at the moment of measurement, according to the local system clock:

 PromiseContext_Begin(g Analytics, name string) PromiseContext
 PromiseContext_End(g Analytics, ctx PromiseContext) PromiseHistory

  • Functions with a Go timestamp supplied from outside, e.g. for offline analysis of epoch-stamped data:

 StampedPromiseContext_Begin(g Analytics, name string, before time.Time) PromiseContext
 StampedPromiseContext_End(g Analytics, ctx PromiseContext, after time.Time) PromiseHistory

Periodic Key Value Storage

Most time-based data can be classified according to their position in the week, from Monday to Friday, with a finite resolution of five-minute intervals. Most system interactions have correlation times of up to 20 minutes, so there is no need to collect nanosecond metrics for functioning systems. Each collection name (collname) represents a vector of values indexed by Unix or Go timestamps, which are mapped into five-minute intervals.
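The indexing arithmetic can be sketched as follows; weeklyBucket is a hypothetical helper showing how a Unix timestamp maps onto one of the 2016 five-minute slots in a week (the library's actual key format may differ, e.g. it anchors the week with CF_MONDAY_MORNING):

```go
package main

import "fmt"

const (
	CF_MEASURE_INTERVAL = 5 * 60           // five-minute resolution, as in the library
	SECONDS_PER_WEEK    = 7 * 24 * 60 * 60 // 604800
)

// weeklyBucket maps a Unix timestamp onto its five-minute slot
// within the week, giving a vector index in [0, 2016).
func weeklyBucket(t int64) int64 {
	return (t % SECONDS_PER_WEEK) / CF_MEASURE_INTERVAL
}

func main() {
	fmt.Println(SECONDS_PER_WEEK / CF_MEASURE_INTERVAL)      // 2016 slots per week
	fmt.Println(weeklyBucket(0), weeklyBucket(299), weeklyBucket(300))
}
```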

These functions fail silently with empty data rather than raising internal errors. The most common cause of failure is an incorrect collection name.

 SumWeeklyKV(g Analytics, collname string, t int64, value float64)
 LearnWeeklyKV(g Analytics, collname string, t int64, value float64)

 AddWeeklyKV_Unix(g Analytics, collname string, t int64, value float64)
 AddWeeklyKV_Go(g Analytics, collname string, t time.Time, value float64)

To retrieve data from the learning store ordered by the working week,

 GetAllWeekMemory(g Analytics, collname string) []float64 

Text n-gram analysis

To scan a body of text, we first have to strip it of any encoding. Examples of handling Unicode UTF-8 are given in the Chinese-language n-gram example. One first selects an appropriate paragraph size for the kind of text being analysed. For book text or narrative, a value of 100 is reasonable; for short notes, a value of 10 is more appropriate. The learning radius determines the approximate amount of text considered to be coherently about the same thing; it affects how much of the whole gets subsampled during summarization.

	const paragraph_radius = 100
	invariants := TT.TextPrism(subject, text, paragraph_radius)

In verbose mode, these functions generate copious output that helps explain the analysis.

Heuristics context

The TT library also contains a heuristic symbol evaluator, CFEngine style. This remains a simple, lightweight approach that is more powerful than several alternatives, and does a similar job to Open Policy Agent, etc. The interface is deliberately simple.

  • InitializeContext() - Reset and empty the set of active context classes

  • ContextAdd(s string) - Add a symbol idempotently to the context set

  • ContextSet() []string - Return an array of context symbols

  • Context(expression string) bool - Evaluate whether the expression is true or false in terms of its heuristic score

Example:

	TT.InitializeContext()

	TT.ContextAdd("busy")

	if TT.Context("busy || lazy && slow") {
	        // policy ...
	}

	fmt.Println("Context:", TT.ContextSet())
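The evaluation semantics can be illustrated with a self-contained miniature. Here contextAdd/evalContext are hypothetical stand-ins for the library's ContextAdd/Context, assuming CFEngine-like rules: a bare symbol is true iff it is in the active set, and '&&' binds tighter than '||' (the real evaluator additionally supports scores and parentheses):

```go
package main

import (
	"fmt"
	"strings"
)

// The active context set: a symbol is "true" iff present.
var context = map[string]bool{}

// contextAdd adds a symbol idempotently to the context set.
func contextAdd(s string) { context[s] = true }

// evalContext evaluates an ||/&& expression over context symbols,
// with '&&' binding tighter than '||'.
func evalContext(expr string) bool {
	for _, disjunct := range strings.Split(expr, "||") {
		all := true
		for _, sym := range strings.Split(disjunct, "&&") {
			if !context[strings.TrimSpace(sym)] {
				all = false
				break
			}
		}
		if all {
			return true
		}
	}
	return false
}

func main() {
	contextAdd("busy")
	fmt.Println(evalContext("busy || lazy && slow")) // true: "busy" alone satisfies
}
```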

Documentation ¶

Index ¶

Constants ¶

const ASSESS_EXCELLENT = 1.0
const ASSESS_EXCELLENT_S = "potential_trustworthiness_high"
const ASSESS_PAR = 0.5
const ASSESS_PAR_S = "potential_trustworthiness_ok"
const ASSESS_SUBPAR = 0.0
const ASSESS_SUBPAR_S = "potential_untrusted"
const ASSESS_WEAK = 0.25
const ASSESS_WEAK_S = "potential_trustworthinss_low"
const CF_MEASURE_INTERVAL = 5 * 60
const CF_MONDAY_MORNING = 345200
const CF_SHIFT_INTERVAL = 6 * 3600
const FORGET_FRACTION = 0.001 // this amount per sentence ~ forget over 1000 words
const GR_CONTAINS int = 2
const GR_EXPRESSES int = 3
const GR_FOLLOWS int = 1
const GR_NEAR int = 4
const GR_NONE int = 0
const HOURS_PER_SHIFT = 6
const INITIAL_VALUE = 0.5
const LOCKDIR = "/tmp" // this should REALLY be a private, secure location
const LOWEST_INTENT_CUTOFF = 0.3 // cutoff for keeping n-grams, measured in intent
const MAXCLUSTERS = 7
const MEANING_THRESH = 20 // reduce this if too few samples
const MILLI = 1000000
const MINIMUM_FREQ_CUTOFF = 3 // cutoff for keeping n-grams, measured in occurrences
const MINUTES_PER_HOUR = 60
const MIN_LEGAL_KEYNAME = 3
const NANO = 1000000000
const NEVER = 0
const NOT_EXIST = 0
const REPEATED_HERE_AND_NOW = 1.0 // initial primer
const SECONDS_PER_DAY = (24 * SECONDS_PER_HOUR)
const SECONDS_PER_HOUR = (60 * SECONDS_PER_MINUTE)
const SECONDS_PER_MINUTE = 60
const SECONDS_PER_SHIFT = (HOURS_PER_SHIFT * SECONDS_PER_HOUR)
const SECONDS_PER_WEEK = (7 * SECONDS_PER_DAY)
const SECONDS_PER_YEAR = (365 * SECONDS_PER_DAY)
const SHIFTS_PER_DAY = 4
const SHIFTS_PER_WEEK = (4 * 7)
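The shift constants above divide the week into 28 six-hour shifts. As an illustration only, a hypothetical shiftLabel helper shows how a weekly offset might be named using the library's GR_DAY_TEXT and GR_SHIFT_TEXT tables, assuming the offset counts seconds from Monday 00:00 (the library anchors real timestamps with CF_MONDAY_MORNING):

```go
package main

import "fmt"

// Constants and tables mirrored from the library.
const SECONDS_PER_DAY = 24 * 60 * 60
const SECONDS_PER_SHIFT = 6 * 60 * 60 // HOURS_PER_SHIFT * SECONDS_PER_HOUR

var GR_DAY_TEXT = []string{"Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday", "Sunday"}
var GR_SHIFT_TEXT = []string{"Night", "Morning", "Afternoon", "Evening"}

// shiftLabel names one of the 28 weekly shifts for an offset
// measured in seconds from Monday 00:00. Hypothetical helper.
func shiftLabel(offset int64) string {
	day := (offset / SECONDS_PER_DAY) % 7
	shift := (offset % SECONDS_PER_DAY) / SECONDS_PER_SHIFT
	return GR_DAY_TEXT[day] + " " + GR_SHIFT_TEXT[shift]
}

func main() {
	fmt.Println(shiftLabel(0))         // Monday Night
	fmt.Println(shiftLabel(9 * 3600))  // Monday Morning
	fmt.Println(shiftLabel(26 * 3600)) // Tuesday Night
}
```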

Variables ¶

var ALL_SENTENCE_INDEX int = 0
var ASSOCIATIONS = make(map[string]Association)
var ATTENTION_LEVEL float64 = 1.0
var CONTEXT map[string]float64
var EXCLUSIONS []string
var FORBIDDEN_ENDING = []string{"but", "and", "the", "or", "a", "an", "its", "it's", "their", "your", "my", "of", "as", "are", "is", "be", "with", "using", "that", "who", "to", "no", "because", "at", "but", "yes", "no", "yeah", "yay", "in"}
var FORBIDDEN_STARTER = []string{"and", "or", "of", "the", "it", "because", "in", "that", "these", "those", "is", "are", "was", "were", "but", "yes", "no", "yeah", "yay"}
var GR_DAY_TEXT = []string{
	"Monday",
	"Tuesday",
	"Wednesday",
	"Thursday",
	"Friday",
	"Saturday",
	"Sunday",
}
var GR_MONTH_TEXT = []string{
	"January",
	"February",
	"March",
	"April",
	"May",
	"June",
	"July",
	"August",
	"September",
	"October",
	"November",
	"December",
}
var GR_SHIFT_TEXT = []string{
	"Night",
	"Morning",
	"Afternoon",
	"Evening",
}
var KEPT int = 0
var LEGCOUNT int = 0
var LEG_SELECTIONS []string
var LEG_WINDOW int = 100 // sentences per leg
var LINKTYPES = []string{"none", "Follows", "Contains", "Expresses", "Near"}
var NODETYPES = []string{"topic", "ngram1", "ngram2", "ngram3", "ngram4", "ngram5", "ngram6", "event", "episode", "user", "signal"}
var SENTENCE_THRESH float64 = 100 // chars
var SKIPPED int = 0
var STM_NGRAM_RANK [MAXCLUSTERS]map[string]float64
var VERBOSE bool = false
var WORDCOUNT int = 0

Functions ¶

func AcquireLock ¶

func AcquireLock(name string)

func AddAssocKV ¶

func AddAssocKV(coll A.Collection, key string, assoc Association)

func AddEpisodeData ¶

func AddEpisodeData(g Analytics, key string, episode_data EpisodeSummary)

func AddKV ¶

func AddKV(g Analytics, collname string, kv KeyValue)
func AddLink(g Analytics, link Link)

func AddLinkCollection ¶

func AddLinkCollection(g Analytics, name string, nodecoll string) A.Collection

func AddNode ¶

func AddNode(g Analytics, kind string, node Node)

func AddNodeCollection ¶

func AddNodeCollection(g Analytics, name string) A.Collection

func AddPromiseHistory ¶

func AddPromiseHistory(g Analytics, coll A.Collection, coll_name string, e PromiseHistory)

func AddWeeklyKV_Go ¶

func AddWeeklyKV_Go(g Analytics, collname string, t time.Time, value float64)

func AddWeeklyKV_Unix ¶

func AddWeeklyKV_Unix(g Analytics, collname string, t int64, value float64)

func AlreadyLinkType ¶

func AlreadyLinkType(existing []ConnectionSemantics, newlnk ConnectionSemantics) bool

func AnnotateLeg ¶

func AnnotateLeg(filename string, selected_sentences []Narrative, leg int, sentence_id_by_rank map[float64]int, this_leg_av_rank, max float64)

func AppendFileValue ¶

func AppendFileValue(name string, value float64)

func AppendStringToFile ¶

func AppendStringToFile(name string, s string)

func AssessPromiseOutcome ¶

func AssessPromiseOutcome(g Analytics, e PromiseHistory, assessed_quality, promise_upper_bound, trust_interval float64) float64

func BelongsToLinSet ¶

func BelongsToLinSet(sets LinSet, member string) (bool, string, string)

func BelongsToSet ¶

func BelongsToSet(sets Set, member string) (bool, string, string)
func BlockLink(g Analytics, c1 Node, rel string, c2 Node, weight float64)

func CanonifyName ¶

func CanonifyName(s string) string

func CleanExpression ¶

func CleanExpression(s string) string

func Context ¶

func Context(s string) float64

func ContextAdd ¶

func ContextAdd(s string)

func ContextEval ¶

func ContextEval(s string) (string, float64)

func ContextSet ¶

func ContextSet() []string
func CreateLink(g Analytics, c1 Node, rel string, c2 Node, weight float64)

func DoughNowt ¶

func DoughNowt(then time.Time) (string, string)

func EndService ¶

func EndService(lock Lock)

func ExcludedByBindings ¶

func ExcludedByBindings(firstword, lastword string) bool

func FirstDerivative ¶

func FirstDerivative(e PromiseHistory, qscale, tscale float64) float64

func FractionateText2Ngrams ¶

func FractionateText2Ngrams(text string) [MAXCLUSTERS]map[string]float64

func FractionateThenRankSentence ¶

func FractionateThenRankSentence(s_idx int, sentence string, total_sentences int, ltm_every_ngram_occurrence [MAXCLUSTERS]map[string][]int) float64

func GetAdjacencyMatrixByInt ¶

func GetAdjacencyMatrixByInt(g Analytics, assoc_type string, symmetrize bool) ([][]float64, int, map[int]string)

func GetAdjacencyMatrixByKey ¶

func GetAdjacencyMatrixByKey(g Analytics, assoc_type string, symmetrize bool) map[VectorPair]float64

func GetAllWeekMemory ¶

func GetAllWeekMemory(g Analytics, collname string) []float64

func GetCollectionType ¶

func GetCollectionType(link Link) int

func GetFullAdjacencyMatrix ¶

func GetFullAdjacencyMatrix(g Analytics, symmetrize bool) ([][]float64, int, map[int]string)

func GetLinkKey ¶

func GetLinkKey(link Link) string

func GetLinkType ¶

func GetLinkType(sttype int) string

func GetLockTime ¶

func GetLockTime(filename string) int64

func GetNode ¶

func GetNode(g Analytics, key string) string

func GetPathsFrom ¶

func GetPathsFrom(g Analytics, layer int, startkey string, sttype int, visited map[string]bool) []string

func GetPrincipalEigenvector ¶

func GetPrincipalEigenvector(adjacency_matrix [][]float64, dim int) []float64

func GetUnixTimeKey ¶

func GetUnixTimeKey(now int64) string

func HashcodeSentenceSplit ¶

func HashcodeSentenceSplit(str string) string
func IncrLink(g Analytics, link Link)
func IncrementLink(g Analytics, c1 Node, rel string, c2 Node)

func InitializeContext ¶

func InitializeContext()

func InitializeSmartSpaceTime ¶

func InitializeSmartSpaceTime()

func InsertNodeIntoCollection ¶

func InsertNodeIntoCollection(g Analytics, node Node, coll A.Collection)

func Intentionality ¶

func Intentionality(n int, s string, sentence_count int) float64

func InvariantDescription ¶

func InvariantDescription(s string) string

func KeyName ¶

func KeyName(s string, n int) string
func LearnLink(g Analytics, c1 Node, rel string, c2 Node, weight float64)

func LearnWeeklyKV ¶

func LearnWeeklyKV(g Analytics, collname string, t int64, value float64)

func LinTogetherWith ¶

func LinTogetherWith(sets LinSet, a1, a2 string)

func LoadAssociations ¶

func LoadAssociations(db A.Database, coll_name string) map[string]Association

func LoadNgram ¶

func LoadNgram(g Analytics, n int)

func LoadNgrams ¶

func LoadNgrams(g Analytics)

func LoadPromiseHistoryKV2Map ¶

func LoadPromiseHistoryKV2Map(g Analytics, coll_name string, extkv map[string]PromiseHistory)

func LongitudinalPersistentConcepts ¶

func LongitudinalPersistentConcepts(topics map[string]float64) [MAXCLUSTERS]map[string]float64

func MatrixMultiplyVector ¶

func MatrixMultiplyVector(adj [][]float64, v []float64, dim int) []float64

func NextWordAndUpdateLTMNgrams ¶

func NextWordAndUpdateLTMNgrams(s_idx int, word string, rrbuffer [MAXCLUSTERS][]string, total_sentences int, ltm_every_ngram_occurrence [MAXCLUSTERS]map[string][]int) (float64, [MAXCLUSTERS][]string)

func OpenDatabase ¶

func OpenDatabase(name, url, user, pwd string) A.Database

func Paren ¶

func Paren(s string, offset int) (string, int)

func Print ¶

func Print(a ...any) (n int, err error)

func PrintMatrix ¶

func PrintMatrix(adjacency_matrix [][]float64, dim int, keys map[int]string)

func PrintNodes ¶

func PrintNodes(g Analytics, collection string)

func PrintPromiseHistoryKV ¶

func PrintPromiseHistoryKV(g Analytics, coll_name string)

func PrintVector ¶

func PrintVector(vec []float64, dim int, keys map[int]string)

func Printf ¶

func Printf(format string, args ...interface{}) (n int, err error)

func Println ¶

func Println(a ...any) (n int, err error)

func RandomAccept ¶

func RandomAccept(probability float64) bool

func RankByIntent ¶

func RankByIntent(selected_sentences []Narrative, ltm_every_ngram_occurrence [MAXCLUSTERS]map[string][]int) map[string]float64

func ReadAndCleanFile ¶

func ReadAndCleanFile(filename string) string

func RemoveLock ¶

func RemoveLock(name string)

func ReviewAndSelectEvents ¶

func ReviewAndSelectEvents(filename string, selected_sentences []Narrative)

func SaveAssociations ¶

func SaveAssociations(collname string, db A.Database, kv map[string]Association)

func SaveNgram ¶

func SaveNgram(g Analytics, n int, invariants [MAXCLUSTERS]map[string]float64)

func SaveNgrams ¶

func SaveNgrams(g Analytics, invariants [MAXCLUSTERS]map[string]float64)

func SavePromiseHistoryKVMap ¶

func SavePromiseHistoryKVMap(g Analytics, collname string, kv []PromiseHistory)

func SecondDerivative ¶

func SecondDerivative(e PromiseHistory, qscale, tscale float64) float64

func SplitIntoSentences ¶

func SplitIntoSentences(text string) []string

func SplitWithParensIntact ¶

func SplitWithParensIntact(expr string, split_ch byte) []string

func StaticIntent ¶

func StaticIntent(g Analytics, str string) float64

func SumWeeklyKV ¶

func SumWeeklyKV(g Analytics, collname string, t int64, value float64)

func TextPrism ¶

func TextPrism(subject, mainpage string, paragraph_radius int) [MAXCLUSTERS]map[string]float64

func TogetherWith ¶

func TogetherWith(sets Set, a1, a2 string)

func TrimParen ¶

func TrimParen(s string) string

func UpdatePromiseHistory ¶

func UpdatePromiseHistory(g Analytics, coll_name, key string, e PromiseHistory)

Types ¶

type AdjacencyMatrix ¶

type AdjacencyMatrix [][]float64

type Analytics ¶

type Analytics struct {
	S_db       A.Database
	S_graph    A.Graph
	S_Nodes    map[string]A.Collection
	S_Links    map[string]A.Collection
	S_Episodes A.Collection
	// contains filtered or unexported fields
}

func OpenAnalytics ¶

func OpenAnalytics(dbname, service_url, user, pwd string) Analytics

type Assessment ¶

type Assessment struct {
	Key     string  `json:"_key"`
	Id      string  `json:"agent"`
	Outcome float64 `json:"outcome"`
}

type Association ¶

type Association struct {
	Key string `json:"_key"`

	STtype int    `json:"STType"`
	Fwd    string `json:"Fwd"`
	Bwd    string `json:"Bwd"`
	NFwd   string `json:"NFwd"`
	NBwd   string `json:"NBwd"`
}

type ConnectionSemantics ¶

type ConnectionSemantics struct {
	LinkType string // Association type
	From     string // Node key pointed to

	FwdSrc string
	BwdSrc string
}

type EpisodeSummary ¶

type EpisodeSummary struct {
	Key string `json:"_key"`

	L  float64 `json: L`            // 1 article text (work output)
	LL float64 `json: LL`           // 2
	N  float64 `json: N`            // 3 average users per episode
	NL float64 `json: NL`           // 4
	I  float64 `json: I`            // 7 mistrust signals per unit text length
	W  float64 `json: W`            // 9 H/L - mistrusted work ratio (sampled article/article work)
	U  float64 `json: U`            // 11 sampled process discussion/sampled article work ratio
	M  float64 `json: M`            // 13 s/H - mistrust level (sampled history/history work)
	TG float64 `json: TG`           // 15 av episode duration per episode
	TU float64 `json: TU`           // 16 av episode duration per episode user
	BF float64 `json: Bot_fraction` // 21 bots/human users
}

func GetEpisodeData ¶

func GetEpisodeData(g Analytics, key string) EpisodeSummary

type KeyValue ¶

type KeyValue struct {
	K string  `json:"_key"`
	R string  `json:"raw_key"`
	V float64 `json:"value"`
}

func GetKV ¶

func GetKV(g Analytics, collname, key string) KeyValue

type LinSet ¶

type LinSet map[string][]string
type Link struct {
	From   string  `json:"_from"`     // mandatory field
	To     string  `json:"_to"`       // mandatory field
	SId    string  `json:"semantics"` // Matches Association key
	Negate bool    `json:"negation"`  // is this enable or block?
	Weight float64 `json:"weight"`
	Key    string  `json:"_key"` // mandatory field (handle)
}
func ReadLink(g Analytics, c1 Node, rel string, c2 Node, weight float64) (Link, bool)

type List ¶

type List []string

type Lock ¶

type Lock struct {
	Ready bool
	This  string
	Last  string
}

func BeginService ¶

func BeginService(name string, ifelapsed, expireafter int64, now int64) Lock

type MatrixRow ¶

type MatrixRow []float64

type Name ¶

type Name string

type Narrative ¶

type Narrative struct {
	// contains filtered or unexported fields
}

func FractionateSentences ¶

func FractionateSentences(text string) ([]Narrative, [MAXCLUSTERS]map[string][]int)

func NarrationMarker ¶

func NarrationMarker(text string, rank float64, index int) Narrative

type Neighbours ¶

type Neighbours []int

type Node ¶

type Node struct {
	Key    string  `json:"_key"`   // mandatory field (handle) - short name
	Data   string  `json: "data"`  // Longer description or bulk string data
	Prefix string  `json:"prefix"` // Collection: Hub, Node, Fragment?
	Weight float64 `json:"weight"` // importance rank

	Gap   int64 `json:"gap"`   // punctuation interval
	Begin int64 `json:"begin"` // Date of start
	End   int64 `json:"end"`   // Date of end
}

func CreateNode ¶

func CreateNode(g Analytics, kind, short_description, vardescription string, weight float64, gap, begin, end int64) Node

func GetFullNode ¶

func GetFullNode(g Analytics, key string) Node

func NextDataEvent ¶

func NextDataEvent(g *Analytics, thread, collection, shortkey, data string, gap, begin, end int64) Node

func PreviousEvent ¶

func PreviousEvent(g *Analytics, thread string) Node

type PromiseContext ¶

type PromiseContext struct {
	Time  time.Time
	Name  string
	Plock Lock
}

func PromiseContext_Begin ¶

func PromiseContext_Begin(g Analytics, name string) PromiseContext

func StampedPromiseContext_Begin ¶

func StampedPromiseContext_Begin(g Analytics, name string, before time.Time) PromiseContext

type PromiseHistory ¶

type PromiseHistory struct {
	PromiseId string `json:"_key"`

	Q  float64 `json:"q"`
	Q1 float64 `json:"q1"`
	Q2 float64 `json:"q2"`

	Q_av  float64 `json:"q_av"`
	Q_var float64 `json:"q_var"`

	T  int64 `json:"lastT"`
	T1 int64 `json:"lastT1"`
	T2 int64 `json:"lastT2"`

	Dt_av  float64 `json:"dT"`
	Dt_var float64 `json:"dT_var"`

	V     float64 `json:"V"`
	AntiT float64 `json:"antiT"`

	Units string `json:"units"`
}
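The update rule behind the q_av/q_var fields is not documented here. As an illustration only, a geometric (exponentially weighted) moving average of the kind CFEngine uses might look like the hypothetical learn function below, where the forgetting weight w is an assumption:

```go
package main

import "fmt"

// learn is a hypothetical exponentially weighted update for a running
// average and variance, in the spirit of the q_av/q_var fields above.
// w is a forgetting weight in (0,1); the library's actual scheme may differ.
func learn(avg, variance, q, w float64) (float64, float64) {
	newAvg := w*q + (1-w)*avg
	dev := q - newAvg
	newVar := w*dev*dev + (1-w)*variance
	return newAvg, newVar
}

func main() {
	avg, v := 0.5, 0.0 // e.g. INITIAL_VALUE as the prior
	for _, q := range []float64{1, 1, 1} {
		avg, v = learn(avg, v, q, 0.5)
	}
	fmt.Printf("%.4f %.4f\n", avg, v) // average converges toward 1, variance stays small
}
```

The appeal of this scheme for trust assessment is that recent promise outcomes dominate while old behaviour is gradually forgotten, matching the idea of trust as a running assessment.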

func GetPromiseHistory ¶

func GetPromiseHistory(g Analytics, collname, key string) (bool, PromiseHistory, A.Collection)

func LearnUpdateKeyValue ¶

func LearnUpdateKeyValue(g Analytics, coll_name, key string, now int64, q float64, units string) PromiseHistory

func PromiseContext_End ¶

func PromiseContext_End(g Analytics, ctx PromiseContext) PromiseHistory

func StampedPromiseContext_End ¶

func StampedPromiseContext_End(g Analytics, ctx PromiseContext, after time.Time) PromiseHistory

type Score ¶

type Score struct {
	Key   string
	Score float64
}

type SemanticLinkSet ¶

type SemanticLinkSet map[string][]ConnectionSemantics

func GetNeighboursOf ¶

func GetNeighboursOf(g Analytics, node string, sttype int, direction string) SemanticLinkSet

func GetPredecessorsOf ¶

func GetPredecessorsOf(g Analytics, node string, sttype int) SemanticLinkSet

func GetSuccessorsOf ¶

func GetSuccessorsOf(g Analytics, node string, sttype int) SemanticLinkSet

func InitializeSemanticLinkSet ¶

func InitializeSemanticLinkSet(start string) SemanticLinkSet

type Set ¶

type Set map[string]map[string]string

type VectorPair ¶

type VectorPair struct {
	From string
	To   string
}
