upload KG Reasoning
23
KG Reasoning/ATOMIC2020/data/sample.tsv
Normal file
@@ -0,0 +1,23 @@
PersonX also saw ___ isFilledBy birds
wage war HasSubEvent cities destroyed
PersonX goes from zero to hero xNeed assess a situation
PersonX is walking on the beach xAttr alone
PersonX cuts open ___ oEffect operated on
PersonX becomes PersonY member xReact included
kerosene ObjectUse become arsonist
PersonX loves PersonX's dog xEffect he want get a dog
PersonX stops kissing PersonY HinderedBy PersonY is crying out of control and PersonX is full of compassion.
PersonX looks at PersonY like that isAfter PersonX is on a date with PersonY
PersonX quickly fell in love oReact happy.
PersonX loves PersonX's work xIntent none
PersonX takes PersonY everywhere xWant refill up with gas
PersonX gets close to PersonY oWant to bond with x
jellyfish AtLocation potato sout
PersonX makes PersonY's laugh isBefore PersonX gets a drink for PersonY
mouse MadeUpOf mouse button
stallion CapableOf service mare
sand HasProperty white
gardener NotDesires plants to die
hot weather Causes fainting
stay in bed xReason of cold
hen Desires bread crumbs
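Each row of sample.tsv is a (head, relation, tail) triple with tab-separated columns. A minimal sketch of parsing such rows (the helper names `parse_triple` and `load_triples` are hypothetical, not part of this repo):

```python
# Sketch: load (head, relation, tail) triples from a tab-separated file,
# assuming exactly three columns per line as in sample.tsv.
def parse_triple(line):
    head, rel, tail = line.rstrip("\n").split("\t")
    return head, rel, tail

def load_triples(path):
    with open(path, encoding="utf-8") as fh:
        return [parse_triple(line) for line in fh if line.strip()]
```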
23
KG Reasoning/ATOMIC2020/prompts/prompt_0.txt
Normal file
@@ -0,0 +1,23 @@
predict the tail entity [MASK] from the given (PersonX also saw ___, isFilledBy, [MASK]) by completing the sentence "PersonX also saw [MASK].".
predict the tail entity [MASK] from the given (wage war, HasSubEvent, [MASK]) by completing the sentence "While wage war,you would ".
predict the tail entity [MASK] from the given (PersonX goes from zero to hero, xNeed, [MASK]) by completing the sentence "PersonX goes from zero to hero. Before, PersonX needs to ".
predict the tail entity [MASK] from the given (PersonX is walking on the beach, xAttr, [MASK]) by completing the sentence "PersonX is walking on the beach. PersonX is ".
predict the tail entity [MASK] from the given (PersonX cuts open ___, oEffect, [MASK]) by completing the sentence "PersonX cuts open . The effect on PersonY will be ".
predict the tail entity [MASK] from the given (PersonX becomes PersonY member, xReact, [MASK]) by completing the sentence "PersonX becomes PersonY member. PersonX will be ".
predict the tail entity [MASK] from the given (kerosene, ObjectUse, [MASK]) by completing the sentence "Kerosene can be used for ".
predict the tail entity [MASK] from the given (PersonX loves PersonX's dog, xEffect, [MASK]) by completing the sentence "PersonX loves PersonX's dog. The effect on PersonX will be".
predict the tail entity [MASK] from the given (PersonX stops kissing PersonY, HinderedBy, [MASK]) by completing the sentence "PersonX stops kissing PersonY, which would not happen if ".
predict the tail entity [MASK] from the given (PersonX looks at PersonY like that, isAfter, [MASK]) by completing the sentence "PersonX looks at PersonY like that. Before that, PersonX".
predict the tail entity [MASK] from the given (PersonX quickly fell in love, oReact, [MASK]) by completing the sentence "PersonX quickly fell in love. As a result, others feel".
predict the tail entity [MASK] from the given (PersonX loves PersonX's work, xIntent, [MASK]) by completing the sentence "PersonX loves PersonX's work. PersonX did this to".
predict the tail entity [MASK] from the given (PersonX takes PersonY everywhere, xWant, [MASK]) by completing the sentence "PersonX takes PersonY everywhere. After, PersonX will want to ".
predict the tail entity [MASK] from the given (PersonX gets close to PersonY, oWant, [MASK]) by completing the sentence "PersonX gets close to PersonY. After, others will want to".
predict the tail entity [MASK] from the given (jellyfish, AtLocation, [MASK]) by completing the sentence "You are likely to find jellyfish in".
predict the tail entity [MASK] from the given (PersonX makes PersonY's laugh, isBefore, [MASK]) by completing the sentence "PersonX makes PersonY's laugh. After that, PersonX ".
predict the tail entity [MASK] from the given (mouse, MadeUpOf, [MASK]) by completing the sentence "A mouse contains".
predict the tail entity [MASK] from the given (stallion, CapableOf, [MASK]) by completing the sentence "Stallion can".
predict the tail entity [MASK] from the given (sand, HasProperty, [MASK]) by completing the sentence "Sand is".
predict the tail entity [MASK] from the given (gardener, NotDesires, [MASK]) by completing the sentence "gardener does not desire".
predict the tail entity [MASK] from the given (hot weather, Causes, [MASK]) by completing the sentence "Hot weather Causes".
predict the tail entity [MASK] from the given (stay in bed, xReason, [MASK]) by completing the sentence "PersonX did stay in bed because".
predict the tail entity [MASK] from the given (hen, Desires, [MASK]) by completing the sentence "hen desires".
49
KG Reasoning/ATOMIC2020/prompts/prompt_1.txt
Normal file
@@ -0,0 +1,49 @@
predict the tail entity [MASK] from the given (PersonX puts the ___ away, isFilledBy, [MASK]) by completing the sentence "PersonX puts the [MASK] away. [MASK] is filled by".The answer is wine.
predict the tail entity [MASK] from the given (PersonX also saw ___, isFilledBy, [MASK]) by completing the sentence "PersonX also saw [MASK]. [MASK] is filled by".The answer is
predict the tail entity [MASK] from the given (celebrate, HasSubEvent, [MASK]) by completing the sentence "While celebrate,you would".The answer is throw party.
predict the tail entity [MASK] from the given (wage war, HasSubEvent, [MASK]) by completing the sentence "While wage war,you would".The answer is
predict the tail entity [MASK] from the given (PersonX plants ___ in PersonX's backyard, xNeed, [MASK]) by completing the sentence "PersonX plants ___ in PersonX's backyard. Before, PersonX needs to ".The answer is to dig the holes.
predict the tail entity [MASK] from the given (PersonX goes from zero to hero, xNeed, [MASK]) by completing the sentence "PersonX goes from zero to hero. Before, PersonX needs to ".The answer is
"PersonX throws the ___ into confusion xAttr convincing
predict the tail entity [MASK] from the given (PersonX throws the ___ into confusion, xAttr, [MASK]) by completing the sentence "PersonX throws the ___ into confusion. PersonX is ".The answer is convincing.
predict the tail entity [MASK] from the given (PersonX is walking on the beach, xAttr, [MASK]) by completing the sentence "PersonX is walking on the beach. PersonX is ".The answer is
predict the tail entity [MASK] from the given (PersonX serves PersonY's purpose, oEffect, [MASK]) by completing the sentence "PersonX serves PersonY's purpose . The effect on PersonY will be ".The answer is none.
predict the tail entity [MASK] from the given (PersonX cuts open ___, oEffect, [MASK]) by completing the sentence "PersonX cuts open . The effect on PersonY will be ".The answer is
predict the tail entity [MASK] from the given (PersonX uploads the video, xReact, [MASK]) by completing the sentence "PersonX uploads the video. PersonX will be ".The answer is excited.
predict the tail entity [MASK] from the given (PersonX becomes PersonY member, xReact, [MASK]) by completing the sentence "PersonX becomes PersonY member. PersonX will be ".The answer is
predict the tail entity [MASK] from the given (gas, ObjectUse, [MASK]) by completing the sentence "gas can be used for ".The answer is put in the car.
predict the tail entity [MASK] from the given (kerosene, ObjectUse, [MASK]) by completing the sentence "Kerosene can be used for ".The answer is
predict the tail entity [MASK] from the given (PersonX runs every ___, xEffect, [MASK]) by completing the sentence "PersonX runs every ___. The effect on PersonX will be".The answer is gets tired.
predict the tail entity [MASK] from the given (PersonX loves PersonX's dog, xEffect, [MASK]) by completing the sentence "PersonX loves PersonX's dog. The effect on PersonX will be".The answer is
"PersonX never paid HinderedBy PersonX's roommate said they need to pay or get out.
predict the tail entity [MASK] from the given (PersonX never paid, HinderedBy, [MASK]) by completing the sentence "PersonX never paid, which would not happen if ".The answer is PersonX's roommate said they need to pay or get out..
predict the tail entity [MASK] from the given (PersonX stops kissing PersonY, HinderedBy, [MASK]) by completing the sentence "PersonX stops kissing PersonY, which would not happen if ".The answer is
"PersonX redoes PersonY's kitchen isAfter PersonX went to PersonY's house
predict the tail entity [MASK] from the given (PersonX redoes PersonY's kitchen, isAfter, [MASK]) by completing the sentence "PersonX redoes PersonY's kitchen. Before that, PersonX".The answer is PersonX went to PersonY's house.
predict the tail entity [MASK] from the given (PersonX looks at PersonY like that, isAfter, [MASK]) by completing the sentence "PersonX looks at PersonY like that. Before that, PersonX".The answer is
predict the tail entity [MASK] from the given (PersonX takes ___ on vacation, oReact, [MASK]) by completing the sentence "PersonX takes ___ on vacation. As a result, others feel".The answer is excited.
predict the tail entity [MASK] from the given (PersonX quickly fell in love, oReact, [MASK]) by completing the sentence "PersonX quickly fell in love. As a result, others feel".The answer is
predict the tail entity [MASK] from the given (PersonX throws ___ out the window, xIntent, [MASK]) by completing the sentence "PersonX throws ___ out the window. PersonX did this to".The answer is to have peace.
predict the tail entity [MASK] from the given (PersonX loves PersonX's work, xIntent, [MASK]) by completing the sentence "PersonX loves PersonX's work. PersonX did this to".The answer is
predict the tail entity [MASK] from the given (PersonX bears the ___ longer, xWant, [MASK]) by completing the sentence "PersonX bears the ___ longer. After, PersonX will want to ".The answer is to eat some chocolate.
predict the tail entity [MASK] from the given (PersonX takes PersonY everywhere, xWant, [MASK]) by completing the sentence "PersonX takes PersonY everywhere. After, PersonX will want to ".The answer is
predict the tail entity [MASK] from the given (PersonX always put ___, oWant, [MASK]) by completing the sentence "PersonX always put ___. After, others will want to".The answer is none.
predict the tail entity [MASK] from the given (PersonX gets close to PersonY, oWant, [MASK]) by completing the sentence "PersonX gets close to PersonY. After, others will want to".The answer is
predict the tail entity [MASK] from the given (omega, AtLocation, [MASK]) by completing the sentence "You are likely to find jellyfish in".The answer is united states.
predict the tail entity [MASK] from the given (jellyfish, AtLocation, [MASK]) by completing the sentence "You are likely to find jellyfish in".The answer is
predict the tail entity [MASK] from the given (PersonX decides to pull over, isBefore, [MASK]) by completing the sentence "PersonX decides to pull over. After that, PersonX ".The answer is PersonX goes in the store.
predict the tail entity [MASK] from the given (PersonX makes PersonY's laugh, isBefore, [MASK]) by completing the sentence "PersonX makes PersonY's laugh. After that, PersonX ".The answer is
predict the tail entity [MASK] from the given (everything, MadeUpOf, [MASK]) by completing the sentence "everything contains".The answer is matter.
predict the tail entity [MASK] from the given (mouse, MadeUpOf, [MASK]) by completing the sentence "A mouse contains".The answer is
predict the tail entity [MASK] from the given (fisherman, CapableOf, [MASK]) by completing the sentence "fisherman can".The answer is land fish.
predict the tail entity [MASK] from the given (stallion, CapableOf, [MASK]) by completing the sentence "Stallion can".The answer is
predict the tail entity [MASK] from the given (text, HasProperty, [MASK]) by completing the sentence "text is".The answer is read.
predict the tail entity [MASK] from the given (sand, HasProperty, [MASK]) by completing the sentence "Sand is".The answer is
predict the tail entity [MASK] from the given (person, NotDesires, [MASK]) by completing the sentence "person does not desire".The answer is losy car keys.
predict the tail entity [MASK] from the given (gardener, NotDesires, [MASK]) by completing the sentence "gardener does not desire".The answer is
predict the tail entity [MASK] from the given (subject, Causes, [MASK]) by completing the sentence "subject".The answer is experience.
predict the tail entity [MASK] from the given (hot weather, Causes, [MASK]) by completing the sentence "Hot weather Causes".The answer is
predict the tail entity [MASK] from the given (die, xReason, [MASK]) by completing the sentence "die because".The answer is of disease.
predict the tail entity [MASK] from the given (stay in bed, xReason, [MASK]) by completing the sentence "PersonX did stay in bed because".The answer is
predict the tail entity [MASK] from the given (person, Desires, [MASK]) by completing the sentence "person desires".The answer is make connections between images.
predict the tail entity [MASK] from the given (hen, Desires, [MASK]) by completing the sentence "hen desires".The answer is
35
KG Reasoning/ATOMIC2020/sample.py
Normal file
@@ -0,0 +1,35 @@
import random

# NOTE: `f` was never defined in the original script. It presumably points at
# the full ATOMIC2020 train split; the filename below is an assumption.
f = open("train.tsv")
f2 = open("train_sample.tsv")
f3 = open("test_sample.tsv")
w = open("sample.tsv", "w")

l = []   # one-shot demonstration triples (full tab-separated lines)
ll = []  # the relation of each demonstration
for i in f2.readlines():
    l.append(i.strip("\n"))
    ll.append(i.split("\t")[1])

# Sentence templates per relation; assumes the second space-separated field of
# each test_sample.tsv line holds the template, as in the original script.
ss = []
for i in f3.readlines():
    ss.append(i.strip("\n").split(" ")[1])

rel_set = []
lines = f.readlines()
random.shuffle(lines)
print(lines[0])
for line in lines:
    head, rel, tail = line.strip("\n").split("\t")
    if rel not in rel_set:  # keep the first shuffled triple per relation
        rel_set.append(rel)
        s = ss[ll.index(rel)].split(" ")[-1]
        sh = l[ll.index(rel)].split("\t")[0]
        w.write(line)
        # answered demonstration prompt for this relation
        w.write("predict the tail entity [MASK] from the given ({}, {}, [MASK]) by completing the sentence \"{}\".The answer is {}.\n".format(head, rel, s, tail))
        # unanswered query prompt, using the sampled head `sh`
        w.write("predict the tail entity [MASK] from the given ({}, {}, [MASK]) by completing the sentence \"{}\".The answer is \n".format(sh, rel, s))

print(len(rel_set))
for fh in (f, f2, f3, w):
    fh.close()
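For each relation the script writes one answered demonstration followed by one unanswered query, which is exactly the pairing visible in prompt_1.txt. A sketch of the pair produced for the (hen, Desires, bread crumbs) triple, using the "hen desires" template from the sample files:

```python
# Sketch of the prompt pair the script emits for one relation.
DEMO = ('predict the tail entity [MASK] from the given ({}, {}, [MASK]) '
        'by completing the sentence "{}".The answer is {}.\n')
QUERY = ('predict the tail entity [MASK] from the given ({}, {}, [MASK]) '
         'by completing the sentence "{}".The answer is \n')

demo = DEMO.format("hen", "Desires", "hen desires", "bread crumbs")
query = QUERY.format("hen", "Desires", "hen desires")
print(demo + query)
```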
23
KG Reasoning/ATOMIC2020/system_eval/BART/1_CHATGPT.json
Normal file
@@ -0,0 +1,23 @@
{"head": "PersonX also saw ___", "relation": "isFilledBy", "tails": "birds", "generations": ["none"], "greedy": "none"}
{"head": "wage war", "relation": "HasSubEvent", "tails": "cities destroyed", "generations": ["none"], "greedy": "none"}
{"head": "PersonX goes from zero to hero", "relation": "xNeed", "tails": "assess a situation", "generations": ["put in hard work and dedication to achieve their goals."], "greedy": "put in hard work and dedication to achieve their goals."}
{"head": "PersonX is walking on the beach", "relation": "xAttr", "tails": "alone", "generations": ["enjoying the scenery and the peaceful atmosphere."], "greedy": "enjoying the scenery and the peaceful atmosphere."}
{"head": "PersonX cuts open ___", "relation": "oEffect", "tails": "operated on", "generations": [" unclear"], "greedy": " unclear"}
{"head": "PersonX becomes PersonY member", "relation": "xReact", "tails": "included", "generations": ["happy and possibly eager to participate in activities or events associated with the membership."], "greedy": "happy and possibly eager to participate in activities or events associated with the membership."}
{"head": "kerosene", "relation": "ObjectUse", "tails": "become arsonist", "generations": ["fueling lamps", " stoves", " and heaters."], "greedy": "fueling lamps"}
{"head": "PersonX loves PersonX's dog", "relation": "xEffect", "tails": "he want get a dog", "generations": ["feeling happy and fulfilled", "having a stronger bond with their pet."], "greedy": "feeling happy and fulfilled"}
{"head": "PersonX stops kissing PersonY", "relation": "HinderedBy", "tails": "PersonY is crying out of control and PersonX is full of compassion.", "generations": [" they both wanted to continue"], "greedy": " they both wanted to continue"}
{"head": "PersonX looks at PersonY like that", "relation": "isAfter", "tails": "PersonX is on a date with PersonY", "generations": [" unknown"], "greedy": " unknown"}
{"head": "PersonX quickly fell in love", "relation": "oReact", "tails": "happy.", "generations": ["happy"], "greedy": "happy"}
{"head": "PersonX loves PersonX's work", "relation": "xIntent", "tails": "none", "generations": [" to feel fulfilled and accomplished."], "greedy": " to feel fulfilled and accomplished."}
{"head": "PersonX takes PersonY everywhere", "relation": "xWant", "tails": "refill up with gas", "generations": ["spend some quality time with PersonY."], "greedy": "spend some quality time with PersonY."}
{"head": "PersonX gets close to PersonY", "relation": "oWant", "tails": "to bond with x", "generations": ["get to know PersonY better."], "greedy": "get to know PersonY better."}
{"head": "jellyfish", "relation": "AtLocation", "tails": "potato sout", "generations": ["the ocean", " sea."], "greedy": "the ocean"}
{"head": "PersonX makes PersonY's laugh", "relation": "isBefore", "tails": "PersonX gets a drink for PersonY", "generations": ["feels happy ", "satisfied."], "greedy": "feels happy "}
{"head": "mouse", "relation": "MadeUpOf", "tails": "mouse button", "generations": ["cells ", "organs."], "greedy": "cells "}
{"head": "stallion", "relation": "CapableOf", "tails": "service mare", "generations": ["mate with multiple mares", "run very fast"], "greedy": "mate with multiple mares"}
{"head": "sand", "relation": "HasProperty", "tails": "white", "generations": ["gritty ", " granular."], "greedy": "gritty "}
{"head": "gardener", "relation": "NotDesires", "tails": "plants to die", "generations": ["to see their plants die or wither."], "greedy": "to see their plants die or wither."}
{"head": "hot weather", "relation": "Causes", "tails": "fainting", "generations": ["dehydration ", " sweating."], "greedy": "dehydration "}
{"head": "stay in bed", "relation": "xReason", "tails": "of cold", "generations": ["they were feeling unwell or tired."], "greedy": "they were feeling unwell or tired."}
{"head": "hen", "relation": "Desires", "tails": "bread crumbs", "generations": ["to lay eggs ", "to find a safe place to roost."], "greedy": "to lay eggs "}
@@ -0,0 +1,23 @@
{"generation": "none", "references": ["the dolphin", "the whale", "monuments", "boss", "trees", "coworker", "the car", "flowers", "birds"], "input": {"head": "PersonX also saw ___", "relation": "isFilledBy"}}
{"generation": "none", "references": ["attack enemy", "people get killed", "reconsider all options", "cities destroyed", "fight", "people become underfed", "people protest", "write proclamation declaring war", "assination", "people become injured or killed", "become angry", "human lives lost", "victory", "prepare for battle", "people in armies would fight", "stop fighting", "bombs droped", "arms makers grow more wealthy", "killing enemy soldiers", "declare war", "win or lose", "attrition", "get bad", "defeat", "engage deception", "attack opposing armies", "innocent people get killed", "national borders change", "death", "people become refugees", "people become homeless", "negotiate peace", "may get into struggle"], "input": {"head": "wage war", "relation": "HasSubEvent"}}
{"generation": "put in hard work and dedication to achieve their goals.", "references": ["PersonX is a wimp", "PersonX got sick from Scott", "PersonX has been knocked out by his girlfriend", "PersonX does not have the courage to confront the situation.", "PersonX does not have the right equipment to defeat the enemy.", "PersonX does not have a gym membership", "PersonX has no skills", "PersonX is afraid", "PersonX was made too weak by his mom's diet"], "input": {"head": "PersonX goes from zero to hero", "relation": "HinderedBy"}}
{"generation": "enjoying the scenery and the peaceful atmosphere.", "references": ["content", "healthy", "relaxed", "alone", "restive", "blissful", "care-free"], "input": {"head": "PersonX is walking on the beach", "relation": "xAttr"}}
{"generation": " unclear", "references": ["rushed to hospital", "none", "operated on"], "input": {"head": "PersonX cuts open ___", "relation": "oEffect"}}
{"generation": "happy and possibly eager to participate in activities or events associated with the membership.", "references": ["included"], "input": {"head": "PersonX becomes PersonY member", "relation": "xReact"}}
{"generation": "fueling lamps", "references": ["create heat", "kill head lice", "soak a rag", "clean paint off a wall", "clean off grease", "burn someone's house who cheated on you", "remove a messbecom", "disinfect a wound", "become arsonist"], "input": {"head": "kerosene", "relation": "ObjectUse"}}
{"generation": "feeling happy and fulfilled", "references": ["want to feed a dog", "none", "he want get a dog", "he interested a pets", "he jalouse for another"], "input": {"head": "PersonX loves PersonX's dog", "relation": "xEffect"}}
{"generation": " they both wanted to continue", "references": ["PersonX is out of control and has become overcome by his sexual desires.", "PersonY is holding PersonX down.", "They are in love with Person Y", "PersonY will leave PersonX.", "They are feeling the passion of the moment.", "PersonY is crying out of control and PersonX is full of compassion."], "input": {"head": "PersonX stops kissing PersonY", "relation": "HinderedBy"}}
{"generation": " unknown", "references": ["PersonX is on a date with PersonY"], "input": {"head": "PersonX looks at PersonY like that", "relation": "isAfter"}}
{"generation": "happy", "references": ["glad", "overjoyed", "happy."], "input": {"head": "PersonX quickly fell in love", "relation": "oReact"}}
{"generation": " to feel fulfilled and accomplished.", "references": ["to be productive.", "to be accomplished", "have a job they love", "none", "to succeed."], "input": {"head": "PersonX loves PersonX's work", "relation": "xIntent"}}
{"generation": "spend some quality time with PersonY.", "references": ["take person Y home", "to eat dinner", "to get PersonY familiar with the area", "to rest", "refill up with gas", "to answer PersonY's questions"], "input": {"head": "PersonX takes PersonY everywhere", "relation": "xWant"}}
{"generation": "get to know PersonY better.", "references": ["to spend time with X", "to marry X", "to bond with x", "to reciprocate the feelings", "to avoid x", "to spend as much time with them as they can"], "input": {"head": "PersonX gets close to PersonY", "relation": "oWant"}}
{"generation": "the ocean", "references": ["ocean", "potato sout", "all oceans of world", "most oceans", "surf", "sea at coral reefs", "salt water", "tidal waters", "shores washed up and dead", "thai restaurant", "monterey bay aquarium", "north sea", "warm ocean", "peanut butter pool", "coral reef", "marine aquarium", "thesand", "public aquarium", "chesapeake bay", "restaurant with strange menu", "mediterranean sea", "open ocean", "zoo", "pervet's bedroom", "tropical waters", "aqurium", "see", "jelly bean", "warm ocean water", "detroit zoo", "book", "ocea", "sandwith", "gulf", "sushi restaurant", "encyclopedia", "tidal pools", "current", "pond", "weirdest places", "cartoon", "penny candy aisle", "tropical body of water", "international waters", "lake", "atlantic ocean", "ocean or aquarium", "japanese restaurant", "chinese entree", "warm sea", "art", "baltimore aquarium", "maui", "jamaca", "movie", "sea water", "oriental restaurant", "bay", "smack", "oceans and seas", "deep ocean", "texas", "saltwater", "hand", "saardina", "underwater", "hawaii", "monterey bay", "florida", "tank", "oceanic trench", "chinese restaurant", "cuba", "jungle", "photographs", "warm ocean waters", "osean", "red sea", "store", "aquarium", "calm waters", "pacific ocean", "sea world", "bathing suit", "shore", "ocean water"], "input": {"head": "jellyfish", "relation": "AtLocation"}}
{"generation": "feels happy ", "references": ["PersonX gets a drink for PersonY"], "input": {"head": "PersonX makes PersonY's laugh", "relation": "isBefore"}}
{"generation": "cells ", "references": ["mouse button", "mouse wheel"], "input": {"head": "mouse", "relation": "MadeUpOf"}}
{"generation": "mate with multiple mares", "references": ["service mare", "cover mare"], "input": {"head": "stallion", "relation": "CapableOf"}}
{"generation": "gritty ", "references": ["found at beach", "found in desert", "dry and small", "found on beach", "played in", "black", "made into glass", "found on beaches", "white", "made up of tiny ground rocks", "gritty"], "input": {"head": "sand", "relation": "HasProperty"}}
{"generation": "to see their plants die or wither.", "references": ["plants to die"], "input": {"head": "gardener", "relation": "NotDesires"}}
{"generation": "dehydration ", "references": ["fainting"], "input": {"head": "hot weather", "relation": "Causes"}}
{"generation": "they were feeling unwell or tired.", "references": ["of cold", "were sick", "feel ill", "you're sick", "you're still tired", "don't feel well", "back hurts", "have sore throat"], "input": {"head": "stay in bed", "relation": "xReason"}}
{"generation": "to lay eggs ", "references": ["bread crumbs"], "input": {"head": "hen", "relation": "Desires"}}
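Each record in the *_gens.jsonl files pairs one model generation with the set of reference tails. A minimal sketch of loading such records and computing an exact-match rate, a cruder check than the BLEU numbers in the score files (the helper `exact_match_rate` is hypothetical, not part of the repo):

```python
import json

def exact_match_rate(jsonl_lines):
    """Fraction of records whose generation exactly equals some reference."""
    hits = total = 0
    for line in jsonl_lines:
        rec = json.loads(line)
        gen = rec["generation"].strip()
        refs = [r.strip() for r in rec["references"]]
        hits += gen in refs
        total += 1
    return hits / total if total else 0.0
```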
@@ -0,0 +1,3 @@
{"bleu1": 0.589095933543675, "bleu2": 0.43007299678252164, "bleu3": 0.3144539231662121, "bleu4": 0.24769028426942935, "HinderedBy": 0.8564833973910988, "xAttr": 0.4489795918367347, "oEffect": 0.6013781189421007, "xReact": 0.07368421052631577, "ObjectUse": 0.925961078642316, "xEffect": 0.7037037037037037, "isAfter": 0.032952700217555565, "oReact": 1.0, "xIntent": 0.6666666666666666, "xWant": 0.918918918918919, "oWant": 0.8148148148148148, "AtLocation": 1.0, "isBefore": 0.11017743498857775, "MadeUpOf": 0.28973213900471884, "CapableOf": 0.375, "HasProperty": 1.0, "NotDesires": 0.38235294117647056, "Causes": 0.3333333333333333, "xReason": 0.6470588235294118, "Desires": 0.3333333333333333}
bleu1 bleu2 bleu3 bleu4 HinderedBy xAttr oEffect xReact ObjectUse xEffect isAfter oReact xIntent xWant oWant AtLocation isBefore MadeUpOf CapableOf HasProperty NotDesires Causes xReason Desires
0.589095933543675 0.43007299678252164 0.3144539231662121 0.24769028426942935 0.8564833973910988 0.4489795918367347 0.6013781189421007 0.07368421052631577 0.925961078642316 0.7037037037037037 0.032952700217555565 1.0 0.6666666666666666 0.918918918918919 0.8148148148148148 1.0 0.11017743498857775 0.28973213900471884 0.375 1.0 0.38235294117647056 0.3333333333333333 0.6470588235294118 0.3333333333333333
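The per-relation scores are BLEU-style overlaps between the generation and the reference tails. As an illustrative sketch, clipped unigram precision (BLEU-1 without a brevity penalty — an approximation, not necessarily the exact scorer used for these numbers):

```python
from collections import Counter

def bleu1(generation, references):
    """Clipped unigram precision of `generation` against multiple references."""
    hyp = generation.split()
    if not hyp:
        return 0.0
    # A hypothesis token is credited at most as often as it appears in the
    # single reference that contains it most (standard BLEU clipping).
    max_ref = Counter()
    for ref in references:
        for tok, n in Counter(ref.split()).items():
            max_ref[tok] = max(max_ref[tok], n)
    clipped = sum(min(n, max_ref[tok]) for tok, n in Counter(hyp).items())
    return clipped / len(hyp)
```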
23
KG Reasoning/ATOMIC2020/system_eval/BART/1_GPT4.json
Normal file
@@ -0,0 +1,23 @@
{"head": "PersonX also saw ___", "relation": "isFilledBy", "tails": "birds", "generations": ["wine"], "greedy": "wine"}
{"head": "wage war", "relation": "HasSubEvent", "tails": "cities destroyed", "generations": ["engage in battles"], "greedy": "engage in battles"}
{"head": "PersonX goes from zero to hero", "relation": "xNeed", "tails": "assess a situation", "generations": ["train and overcome challenges"], "greedy": "train and overcome challenges"}
{"head": "PersonX is walking on the beach", "relation": "xAttr", "tails": "alone", "generations": ["relaxed"], "greedy": "relaxed"}
{"head": "PersonX cuts open ___", "relation": "oEffect", "tails": "operated on", "generations": [" getting access to the contents"], "greedy": " getting access to the contents"}
{"head": "PersonX becomes PersonY member", "relation": "xReact", "tails": "included", "generations": ["proud"], "greedy": "proud"}
{"head": "kerosene", "relation": "ObjectUse", "tails": "become arsonist", "generations": ["fueling lamps and heaters"], "greedy": "fueling lamps and heaters"}
{"head": "PersonX loves PersonX's dog", "relation": "xEffect", "tails": "he want get a dog", "generations": ["happiness and companionship"], "greedy": "happiness and companionship"}
{"head": "PersonX stops kissing PersonY", "relation": "HinderedBy", "tails": "PersonY is crying out of control and PersonX is full of compassion.", "generations": ["they felt comfortable and attracted to each other."], "greedy": "they felt comfortable and attracted to each other."}
{"head": "PersonX looks at PersonY like that", "relation": "isAfter", "tails": "PersonX is on a date with PersonY", "generations": [" PersonX had a conversation with PersonY."], "greedy": " PersonX had a conversation with PersonY."}
{"head": "PersonX quickly fell in love", "relation": "oReact", "tails": "happy.", "generations": ["happy"], "greedy": "happy"}
{"head": "PersonX loves PersonX's work", "relation": "xIntent", "tails": "none", "generations": [" show appreciation."], "greedy": " show appreciation."}
{"head": "PersonX takes PersonY everywhere", "relation": "xWant", "tails": "refill up with gas", "generations": ["rest and relax."], "greedy": "rest and relax."}
{"head": "PersonX gets close to PersonY", "relation": "oWant", "tails": "to bond with x", "generations": ["get to know PersonX better."], "greedy": "get to know PersonX better."}
{"head": "jellyfish", "relation": "AtLocation", "tails": "potato sout", "generations": ["ocean waters."], "greedy": "ocean waters."}
{"head": "PersonX makes PersonY's laugh", "relation": "isBefore", "tails": "PersonX gets a drink for PersonY", "generations": ["continues talking to PersonY"], "greedy": "continues talking to PersonY"}
{"head": "mouse", "relation": "MadeUpOf", "tails": "mouse button", "generations": ["cells and tissues."], "greedy": "cells and tissues."}
{"head": "stallion", "relation": "CapableOf", "tails": "service mare", "generations": ["run fast."], "greedy": "run fast."}
{"head": "sand", "relation": "HasProperty", "tails": "white", "generations": ["grainy."], "greedy": "grainy."}
{"head": "gardener", "relation": "NotDesires", "tails": "plants to die", "generations": ["plant diseases."], "greedy": "plant diseases."}
{"head": "hot weather", "relation": "Causes", "tails": "fainting", "generations": ["sweating"], "greedy": "sweating"}
{"head": "stay in bed", "relation": "xReason", "tails": "of cold", "generations": ["they were sick."], "greedy": "they were sick."}
{"head": "hen", "relation": "Desires", "tails": "bread crumbs", "generations": ["to lay eggs."], "greedy": "to lay eggs."}
23
KG Reasoning/ATOMIC2020/system_eval/BART/1_GPT4_gens.jsonl
Normal file
@@ -0,0 +1,23 @@
{"generation": "wine", "references": ["the dolphin", "the whale", "monuments", "boss", "trees", "coworker", "the car", "flowers", "birds"], "input": {"head": "PersonX also saw ___", "relation": "isFilledBy"}}
{"generation": "engage in battles", "references": ["attack enemy", "people get killed", "reconsider all options", "cities destroyed", "fight", "people become underfed", "people protest", "write proclamation declaring war", "assination", "people become injured or killed", "become angry", "human lives lost", "victory", "prepare for battle", "people in armies would fight", "stop fighting", "bombs droped", "arms makers grow more wealthy", "killing enemy soldiers", "declare war", "win or lose", "attrition", "get bad", "defeat", "engage deception", "attack opposing armies", "innocent people get killed", "national borders change", "death", "people become refugees", "people become homeless", "negotiate peace", "may get into struggle"], "input": {"head": "wage war", "relation": "HasSubEvent"}}
{"generation": "train and overcome challenges", "references": ["PersonX is a wimp", "PersonX got sick from Scott", "PersonX has been knocked out by his girlfriend", "PersonX does not have the courage to confront the situation.", "PersonX does not have the right equipment to defeat the enemy.", "PersonX does not have a gym membership", "PersonX has no skills", "PersonX is afraid", "PersonX was made too weak by his mom's diet"], "input": {"head": "PersonX goes from zero to hero", "relation": "HinderedBy"}}
{"generation": "relaxed", "references": ["content", "healthy", "relaxed", "alone", "restive", "blissful", "care-free"], "input": {"head": "PersonX is walking on the beach", "relation": "xAttr"}}
{"generation": " getting access to the contents", "references": ["rushed to hospital", "none", "operated on"], "input": {"head": "PersonX cuts open ___", "relation": "oEffect"}}
{"generation": "proud", "references": ["included"], "input": {"head": "PersonX becomes PersonY member", "relation": "xReact"}}
{"generation": "fueling lamps and heaters", "references": ["create heat", "kill head lice", "soak a rag", "clean paint off a wall", "clean off grease", "burn someone's house who cheated on you", "remove a messbecom", "disinfect a wound", "become arsonist"], "input": {"head": "kerosene", "relation": "ObjectUse"}}
{"generation": "happiness and companionship", "references": ["want to feed a dog", "none", "he want get a dog", "he interested a pets", "he jalouse for another"], "input": {"head": "PersonX loves PersonX's dog", "relation": "xEffect"}}
{"generation": "they felt comfortable and attracted to each other.", "references": ["PersonX is out of control and has become overcome by his sexual desires.", "PersonY is holding PersonX down.", "They are in love with Person Y", "PersonY will leave PersonX.", "They are feeling the passion of the moment.", "PersonY is crying out of control and PersonX is full of compassion."], "input": {"head": "PersonX stops kissing PersonY", "relation": "HinderedBy"}}
{"generation": " PersonX had a conversation with PersonY.", "references": ["PersonX is on a date with PersonY"], "input": {"head": "PersonX looks at PersonY like that", "relation": "isAfter"}}
{"generation": "happy", "references": ["glad", "overjoyed", "happy."], "input": {"head": "PersonX quickly fell in love", "relation": "oReact"}}
{"generation": " show appreciation.", "references": ["to be productive.", "to be accomplished", "have a job they love", "none", "to succeed."], "input": {"head": "PersonX loves PersonX's work", "relation": "xIntent"}}
{"generation": "rest and relax.", "references": ["take person Y home", "to eat dinner", "to get PersonY familiar with the area", "to rest", "refill up with gas", "to answer PersonY's questions"], "input": {"head": "PersonX takes PersonY everywhere", "relation": "xWant"}}
{"generation": "get to know PersonX better.", "references": ["to spend time with X", "to marry X", "to bond with x", "to reciprocate the feelings", "to avoid x", "to spend as much time with them as they can"], "input": {"head": "PersonX gets close to PersonY", "relation": "oWant"}}
{"generation": "ocean waters.", "references": ["ocean", "potato sout", "all oceans of world", "most oceans", "surf", "sea at coral reefs", "salt water", "tidal waters", "shores washed up and dead", "thai restaurant", "monterey bay aquarium", "north sea", "warm ocean", "peanut butter pool", "coral reef", "marine aquarium", "thesand", "public aquarium", "chesapeake bay", "restaurant with strange menu", "mediterranean sea", "open ocean", "zoo", "pervet's bedroom", "tropical waters", "aqurium", "see", "jelly bean", "warm ocean water", "detroit zoo", "book", "ocea", "sandwith", "gulf", "sushi restaurant", "encyclopedia", "tidal pools", "current", "pond", "weirdest places", "cartoon", "penny candy aisle", "tropical body of water", "international waters", "lake", "atlantic ocean", "ocean or aquarium", "japanese restaurant", "chinese entree", "warm sea", "art", "baltimore aquarium", "maui", "jamaca", "movie", "sea water", "oriental restaurant", "bay", "smack", "oceans and seas", "deep ocean", "texas", "saltwater", "hand", "saardina", "underwater", "hawaii", "monterey bay", "florida", "tank", "oceanic trench", "chinese restaurant", "cuba", "jungle", "photographs", "warm ocean waters", "osean", "red sea", "store", "aquarium", "calm waters", "pacific ocean", "sea world", "bathing suit", "shore", "ocean water"], "input": {"head": "jellyfish", "relation": "AtLocation"}}
{"generation": "continues talking to PersonY", "references": ["PersonX gets a drink for PersonY"], "input": {"head": "PersonX makes PersonY's laugh", "relation": "isBefore"}}
{"generation": "cells and tissues.", "references": ["mouse button", "mouse wheel"], "input": {"head": "mouse", "relation": "MadeUpOf"}}
{"generation": "run fast.", "references": ["service mare", "cover mare"], "input": {"head": "stallion", "relation": "CapableOf"}}
{"generation": "grainy.", "references": ["found at beach", "found in desert", "dry and small", "found on beach", "played in", "black", "made into glass", "found on beaches", "white", "made up of tiny ground rocks", "gritty"], "input": {"head": "sand", "relation": "HasProperty"}}
{"generation": "plant diseases.", "references": ["plants to die"], "input": {"head": "gardener", "relation": "NotDesires"}}
{"generation": "sweating", "references": ["fainting"], "input": {"head": "hot weather", "relation": "Causes"}}
{"generation": "they were sick.", "references": ["of cold", "were sick", "feel ill", "you're sick", "you're still tired", "don't feel well", "back hurts", "have sore throat"], "input": {"head": "stay in bed", "relation": "xReason"}}
{"generation": "to lay eggs.", "references": ["bread crumbs"], "input": {"head": "hen", "relation": "Desires"}}
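The `*_gens.jsonl` rows above pair each model generation with the full reference set for its (head, relation) query. As a hedged illustration of how this format can be consumed (the function name `load_gens` is our own, not taken from the repository's evaluation scripts):

```python
import json

def load_gens(path):
    """Read one JSON object per line: generation, gold references, and the query."""
    rows = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:  # tolerate blank lines between records
                rows.append(json.loads(line))
    return rows
```

Each loaded row then exposes `row["generation"]`, `row["references"]`, and the query under `row["input"]["head"]` / `row["input"]["relation"]`.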
@@ -0,0 +1,3 @@
{"bleu1": 0.7544501039594714, "bleu2": 0.5695388887353877, "bleu3": 0.40817203874133706, "bleu4": 0.3102100106754677, "isFilledBy": 1.0, "HasSubEvent": 1.0, "HinderedBy": 0.9299999999999999, "xAttr": 1.0, "oEffect": 0.4838709677419355, "xReact": 0.21952465443761057, "ObjectUse": 1.0, "xEffect": 0.5925925925925926, "isAfter": 0.8048780487804879, "oReact": 1.0, "xIntent": 0.8421052631578947, "xWant": 0.8666666666666667, "oWant": 0.8518518518518519, "AtLocation": 0.9230769230769231, "isBefore": 0.6501584248126363, "MadeUpOf": 0.4444444444444444, "CapableOf": 0.39770636302860873, "HasProperty": 0.8571428571428571, "NotDesires": 0.6666666666666666, "Causes": 0.625, "xReason": 0.9333333333333333, "Desires": 0.3333333333333333}
bleu1 bleu2 bleu3 bleu4 isFilledBy HasSubEvent HinderedBy xAttr oEffect xReact ObjectUse xEffect isAfter oReact xIntent xWant oWant AtLocation isBefore MadeUpOf CapableOf HasProperty NotDesires Causes xReason Desires
0.7544501039594714 0.5695388887353877 0.40817203874133706 0.3102100106754677 1.0 1.0 0.9299999999999999 1.0 0.4838709677419355 0.21952465443761057 1.0 0.5925925925925926 0.8048780487804879 1.0 0.8421052631578947 0.8666666666666667 0.8518518518518519 0.9230769230769231 0.6501584248126363 0.4444444444444444 0.39770636302860873 0.8571428571428571 0.6666666666666666 0.625 0.9333333333333333 0.3333333333333333
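The score files above report corpus-level bleu1–bleu4 plus one score per relation. The evaluation script itself is not part of this diff; under the assumption that each per-relation figure rewards a generation's best match against any of its references, a pure-Python sketch of the BLEU-1 ingredient (clipped unigram precision, with no brevity penalty or smoothing) would be:

```python
from collections import Counter

def unigram_precision(hyp, ref):
    """Clipped unigram precision of the hypothesis against one reference string."""
    hyp_tok, ref_tok = hyp.lower().split(), ref.lower().split()
    if not hyp_tok:
        return 0.0
    ref_counts = Counter(ref_tok)
    # Each hypothesis token is credited at most as often as it occurs in the reference.
    overlap = sum(min(c, ref_counts[t]) for t, c in Counter(hyp_tok).items())
    return overlap / len(hyp_tok)

def best_over_references(hyp, refs):
    """Score a generation against its best-matching reference, as in the JSONL rows."""
    return max(unigram_precision(hyp, r) for r in refs)

score = best_over_references("the ocean", ["ocean", "salt water"])  # 0.5
```

A full reproduction of the bleu2–bleu4 columns would additionally need higher-order n-gram precisions and the brevity penalty from standard BLEU, so this sketch will not match the table exactly.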
23
KG Reasoning/ATOMIC2020/system_eval/BART/1_TEXT003.json
Normal file
@@ -0,0 +1,23 @@
{"head": "PersonX also saw ___", "relation": "isFilledBy", "tails": "birds", "generations": ["a menu"], "greedy": "a menu"}
{"head": "wage war", "relation": "HasSubEvent", "tails": "cities destroyed", "generations": ["launch attack"], "greedy": "launch attack"}
{"head": "PersonX goes from zero to hero", "relation": "xNeed", "tails": "assess a situation", "generations": ["to take action"], "greedy": "to take action"}
{"head": "PersonX is walking on the beach", "relation": "xAttr", "tails": "alone", "generations": ["enjoying the scenery"], "greedy": "enjoying the scenery"}
{"head": "PersonX cuts open ___", "relation": "oEffect", "tails": "operated on", "generations": [" pain"], "greedy": " pain"}
{"head": "PersonX becomes PersonY member", "relation": "xReact", "tails": "included", "generations": ["thrilled"], "greedy": "thrilled"}
{"head": "kerosene", "relation": "ObjectUse", "tails": "become arsonist", "generations": ["heating"], "greedy": "heating"}
{"head": "PersonX loves PersonX's dog", "relation": "xEffect", "tails": "he want get a dog", "generations": ["feeling of joy"], "greedy": "feeling of joy"}
{"head": "PersonX stops kissing PersonY", "relation": "HinderedBy", "tails": "PersonY is crying out of control and PersonX is full of compassion.", "generations": [" PersonY was not uncomfortable."], "greedy": " PersonY was not uncomfortable."}
{"head": "PersonX looks at PersonY like that", "relation": "isAfter", "tails": "PersonX is on a date with PersonY", "generations": [" PersonX did something."], "greedy": " PersonX did something."}
{"head": "PersonX quickly fell in love", "relation": "oReact", "tails": "happy.", "generations": ["envious"], "greedy": "envious"}
{"head": "PersonX loves PersonX's work", "relation": "xIntent", "tails": "none", "generations": [" find fulfillment"], "greedy": " find fulfillment"}
{"head": "PersonX takes PersonY everywhere", "relation": "xWant", "tails": "refill up with gas", "generations": ["spend time alone."], "greedy": "spend time alone."}
{"head": "PersonX gets close to PersonY", "relation": "oWant", "tails": "to bond with x", "generations": ["emulate"], "greedy": "emulate"}
{"head": "jellyfish", "relation": "AtLocation", "tails": "potato sout", "generations": ["oceans"], "greedy": "oceans"}
{"head": "PersonX makes PersonY's laugh", "relation": "isBefore", "tails": "PersonX gets a drink for PersonY", "generations": ["PersonX continues to make PersonY laugh"], "greedy": "PersonX continues to make PersonY laugh"}
{"head": "mouse", "relation": "MadeUpOf", "tails": "mouse button", "generations": ["components"], "greedy": "components"}
{"head": "stallion", "relation": "CapableOf", "tails": "service mare", "generations": ["pull a carriage"], "greedy": "pull a carriage"}
{"head": "sand", "relation": "HasProperty", "tails": "white", "generations": ["coarse"], "greedy": "coarse"}
{"head": "gardener", "relation": "NotDesires", "tails": "plants to die", "generations": ["weeds"], "greedy": "weeds"}
{"head": "hot weather", "relation": "Causes", "tails": "fainting", "generations": ["dehydration"], "greedy": "dehydration"}
{"head": "stay in bed", "relation": "xReason", "tails": "of cold", "generations": ["of illness"], "greedy": "of illness"}
{"head": "hen", "relation": "Desires", "tails": "bread crumbs", "generations": ["lay eggs"], "greedy": "lay eggs"}
@@ -0,0 +1,23 @@
{"generation": "a menu", "references": ["the dolphin", "the whale", "monuments", "boss", "trees", "coworker", "the car", "flowers", "birds"], "input": {"head": "PersonX also saw ___", "relation": "isFilledBy"}}
{"generation": "launch attack", "references": ["attack enemy", "people get killed", "reconsider all options", "cities destroyed", "fight", "people become underfed", "people protest", "write proclamation declaring war", "assination", "people become injured or killed", "become angry", "human lives lost", "victory", "prepare for battle", "people in armies would fight", "stop fighting", "bombs droped", "arms makers grow more wealthy", "killing enemy soldiers", "declare war", "win or lose", "attrition", "get bad", "defeat", "engage deception", "attack opposing armies", "innocent people get killed", "national borders change", "death", "people become refugees", "people become homeless", "negotiate peace", "may get into struggle"], "input": {"head": "wage war", "relation": "HasSubEvent"}}
{"generation": "to take action", "references": ["PersonX is a wimp", "PersonX got sick from Scott", "PersonX has been knocked out by his girlfriend", "PersonX does not have the courage to confront the situation.", "PersonX does not have the right equipment to defeat the enemy.", "PersonX does not have a gym membership", "PersonX has no skills", "PersonX is afraid", "PersonX was made too weak by his mom's diet"], "input": {"head": "PersonX goes from zero to hero", "relation": "HinderedBy"}}
{"generation": "enjoying the scenery", "references": ["content", "healthy", "relaxed", "alone", "restive", "blissful", "care-free"], "input": {"head": "PersonX is walking on the beach", "relation": "xAttr"}}
{"generation": " pain", "references": ["rushed to hospital", "none", "operated on"], "input": {"head": "PersonX cuts open ___", "relation": "oEffect"}}
{"generation": "thrilled", "references": ["included"], "input": {"head": "PersonX becomes PersonY member", "relation": "xReact"}}
{"generation": "heating", "references": ["create heat", "kill head lice", "soak a rag", "clean paint off a wall", "clean off grease", "burn someone's house who cheated on you", "remove a messbecom", "disinfect a wound", "become arsonist"], "input": {"head": "kerosene", "relation": "ObjectUse"}}
{"generation": "feeling of joy", "references": ["want to feed a dog", "none", "he want get a dog", "he interested a pets", "he jalouse for another"], "input": {"head": "PersonX loves PersonX's dog", "relation": "xEffect"}}
{"generation": " PersonY was not uncomfortable.", "references": ["PersonX is out of control and has become overcome by his sexual desires.", "PersonY is holding PersonX down.", "They are in love with Person Y", "PersonY will leave PersonX.", "They are feeling the passion of the moment.", "PersonY is crying out of control and PersonX is full of compassion."], "input": {"head": "PersonX stops kissing PersonY", "relation": "HinderedBy"}}
{"generation": " PersonX did something.", "references": ["PersonX is on a date with PersonY"], "input": {"head": "PersonX looks at PersonY like that", "relation": "isAfter"}}
{"generation": "envious", "references": ["glad", "overjoyed", "happy."], "input": {"head": "PersonX quickly fell in love", "relation": "oReact"}}
{"generation": " find fulfillment", "references": ["to be productive.", "to be accomplished", "have a job they love", "none", "to succeed."], "input": {"head": "PersonX loves PersonX's work", "relation": "xIntent"}}
{"generation": "spend time alone.", "references": ["take person Y home", "to eat dinner", "to get PersonY familiar with the area", "to rest", "refill up with gas", "to answer PersonY's questions"], "input": {"head": "PersonX takes PersonY everywhere", "relation": "xWant"}}
{"generation": "emulate", "references": ["to spend time with X", "to marry X", "to bond with x", "to reciprocate the feelings", "to avoid x", "to spend as much time with them as they can"], "input": {"head": "PersonX gets close to PersonY", "relation": "oWant"}}
{"generation": "oceans", "references": ["ocean", "potato sout", "all oceans of world", "most oceans", "surf", "sea at coral reefs", "salt water", "tidal waters", "shores washed up and dead", "thai restaurant", "monterey bay aquarium", "north sea", "warm ocean", "peanut butter pool", "coral reef", "marine aquarium", "thesand", "public aquarium", "chesapeake bay", "restaurant with strange menu", "mediterranean sea", "open ocean", "zoo", "pervet's bedroom", "tropical waters", "aqurium", "see", "jelly bean", "warm ocean water", "detroit zoo", "book", "ocea", "sandwith", "gulf", "sushi restaurant", "encyclopedia", "tidal pools", "current", "pond", "weirdest places", "cartoon", "penny candy aisle", "tropical body of water", "international waters", "lake", "atlantic ocean", "ocean or aquarium", "japanese restaurant", "chinese entree", "warm sea", "art", "baltimore aquarium", "maui", "jamaca", "movie", "sea water", "oriental restaurant", "bay", "smack", "oceans and seas", "deep ocean", "texas", "saltwater", "hand", "saardina", "underwater", "hawaii", "monterey bay", "florida", "tank", "oceanic trench", "chinese restaurant", "cuba", "jungle", "photographs", "warm ocean waters", "osean", "red sea", "store", "aquarium", "calm waters", "pacific ocean", "sea world", "bathing suit", "shore", "ocean water"], "input": {"head": "jellyfish", "relation": "AtLocation"}}
{"generation": "PersonX continues to make PersonY laugh", "references": ["PersonX gets a drink for PersonY"], "input": {"head": "PersonX makes PersonY's laugh", "relation": "isBefore"}}
{"generation": "components", "references": ["mouse button", "mouse wheel"], "input": {"head": "mouse", "relation": "MadeUpOf"}}
{"generation": "pull a carriage", "references": ["service mare", "cover mare"], "input": {"head": "stallion", "relation": "CapableOf"}}
{"generation": "coarse", "references": ["found at beach", "found in desert", "dry and small", "found on beach", "played in", "black", "made into glass", "found on beaches", "white", "made up of tiny ground rocks", "gritty"], "input": {"head": "sand", "relation": "HasProperty"}}
{"generation": "weeds", "references": ["plants to die"], "input": {"head": "gardener", "relation": "NotDesires"}}
{"generation": "dehydration", "references": ["fainting"], "input": {"head": "hot weather", "relation": "Causes"}}
{"generation": "of illness", "references": ["of cold", "were sick", "feel ill", "you're sick", "you're still tired", "don't feel well", "back hurts", "have sore throat"], "input": {"head": "stay in bed", "relation": "xReason"}}
{"generation": "lay eggs", "references": ["bread crumbs"], "input": {"head": "hen", "relation": "Desires"}}
@@ -0,0 +1,3 @@
{"bleu1": 0.6937263337019739, "bleu2": 0.41581408881136733, "bleu3": 0.25740888835768805, "bleu4": 0.1625873794954571, "isFilledBy": 1.0, "HasSubEvent": 1.0, "HinderedBy": 0.9035588735026947, "xAttr": 0.65, "oEffect": 1.0, "xReact": 0.5, "ObjectUse": 0.6514390575310556, "xEffect": 0.6918152117189051, "isAfter": 0.5348131499823614, "oReact": 0.42857142857142855, "xIntent": 0.6470588235294118, "xWant": 0.8874100177457646, "oWant": 0.6514390575310556, "AtLocation": 1.0, "isBefore": 0.717948717948718, "MadeUpOf": 0.6333861926251716, "CapableOf": 0.4666666666666667, "HasProperty": 1.0, "NotDesires": 0.12113791079679323, "Causes": 0.36363636363636365, "xReason": 0.9, "Desires": 0.3032653298563167}
bleu1 bleu2 bleu3 bleu4 isFilledBy HasSubEvent HinderedBy xAttr oEffect xReact ObjectUse xEffect isAfter oReact xIntent xWant oWant AtLocation isBefore MadeUpOf CapableOf HasProperty NotDesires Causes xReason Desires
0.6937263337019739 0.41581408881136733 0.25740888835768805 0.1625873794954571 1.0 1.0 0.9035588735026947 0.65 1.0 0.5 0.6514390575310556 0.6918152117189051 0.5348131499823614 0.42857142857142855 0.6470588235294118 0.8874100177457646 0.6514390575310556 1.0 0.717948717948718 0.6333861926251716 0.4666666666666667 1.0 0.12113791079679323 0.36363636363636365 0.9 0.3032653298563167
23
KG Reasoning/ATOMIC2020/system_eval/BART/GPT4.json
Normal file
@@ -0,0 +1,23 @@
{"head": "PersonX also saw ___", "relation": "isFilledBy", "tails": "birds", "generations": ["a movie"], "greedy": "a movie"}
{"head": "wage war", "relation": "HasSubEvent", "tails": "cities destroyed", "generations": ["engage in battles"], "greedy": "engage in battles"}
{"head": "PersonX goes from zero to hero", "relation": "xNeed", "tails": "assess a situation", "generations": ["overcome obstacles and develop their skills"], "greedy": "overcome obstacles and develop their skills"}
{"head": "PersonX is walking on the beach", "relation": "xAttr", "tails": "alone", "generations": ["relaxed"], "greedy": "relaxed"}
{"head": "PersonX cuts open ___", "relation": "oEffect", "tails": "operated on", "generations": [" receiving their items"], "greedy": " receiving their items"}
{"head": "PersonX becomes PersonY member", "relation": "xReact", "tails": "included", "generations": ["excited"], "greedy": "excited"}
{"head": "kerosene", "relation": "ObjectUse", "tails": "become arsonist", "generations": ["fueling heaters", "lamps", "stoves"], "greedy": "fueling heaters"}
{"head": "PersonX loves PersonX's dog", "relation": "xEffect", "tails": "he want get a dog", "generations": ["happiness and companionship"], "greedy": "happiness and companionship"}
{"head": "PersonX stops kissing PersonY", "relation": "HinderedBy", "tails": "PersonY is crying out of control and PersonX is full of compassion.", "generations": ["they were not interrupted or uncomfortable"], "greedy": "they were not interrupted or uncomfortable"}
{"head": "PersonX looks at PersonY like that", "relation": "isAfter", "tails": "PersonX is on a date with PersonY", "generations": [" experienced a strong emotion or had a significant interaction with PersonY"], "greedy": " experienced a strong emotion or had a significant interaction with PersonY"}
{"head": "PersonX quickly fell in love", "relation": "oReact", "tails": "happy.", "generations": ["surprised or happy for them."], "greedy": "surprised or happy for them."}
{"head": "PersonX loves PersonX's work", "relation": "xIntent", "tails": "none", "generations": [" achieve personal satisfaction and fulfillment"], "greedy": " achieve personal satisfaction and fulfillment"}
{"head": "PersonX takes PersonY everywhere", "relation": "xWant", "tails": "refill up with gas", "generations": ["share their experiences", "memories together"], "greedy": "share their experiences"}
{"head": "PersonX gets close to PersonY", "relation": "oWant", "tails": "to bond with x", "generations": ["bond better", "understand their relationship"], "greedy": "bond better"}
{"head": "jellyfish", "relation": "AtLocation", "tails": "potato sout", "generations": ["the ocean"], "greedy": "the ocean"}
{"head": "PersonX makes PersonY's laugh", "relation": "isBefore", "tails": "PersonX gets a drink for PersonY", "generations": ["feels a sense of accomplishment and connection with PersonY"], "greedy": "feels a sense of accomplishment and connection with PersonY"}
{"head": "mouse", "relation": "MadeUpOf", "tails": "mouse button", "generations": ["bones", " muscles", " organs", " and fur"], "greedy": "bones"}
{"head": "stallion", "relation": "CapableOf", "tails": "service mare", "generations": ["gallop at high speeds"], "greedy": "gallop at high speeds"}
{"head": "sand", "relation": "HasProperty", "tails": "white", "generations": ["granular and composed of small particles"], "greedy": "granular and composed of small particles"}
{"head": "gardener", "relation": "NotDesires", "tails": "plants to die", "generations": ["dead or diseased plants."], "greedy": "dead or diseased plants."}
{"head": "hot weather", "relation": "Causes", "tails": "fainting", "generations": ["increased sweating and discomfort"], "greedy": "increased sweating and discomfort"}
{"head": "stay in bed", "relation": "xReason", "tails": "of cold", "generations": ["they were feeling unwell or tired"], "greedy": "they were feeling unwell or tired"}
{"head": "hen", "relation": "Desires", "tails": "bread crumbs", "generations": ["a safe place to lay eggs and find food"], "greedy": "a safe place to lay eggs and find food"}
23
KG Reasoning/ATOMIC2020/system_eval/BART/GPT4_gens.jsonl
Normal file
@@ -0,0 +1,23 @@
|
||||
{"generation": "a movie", "references": ["the dolphin", "the whale", "monuments", "boss", "trees", "coworker", "the car", "flowers", "birds"], "input": {"head": "PersonX also saw ___", "relation": "isFilledBy"}}
|
||||
{"generation": "engage in battles", "references": ["attack enemy", "people get killed", "reconsider all options", "cities destroyed", "fight", "people become underfed", "people protest", "write proclamation declaring war", "assination", "people become injured or killed", "become angry", "human lives lost", "victory", "prepare for battle", "people in armies would fight", "stop fighting", "bombs droped", "arms makers grow more wealthy", "killing enemy soldiers", "declare war", "win or lose", "attrition", "get bad", "defeat", "engage deception", "attack opposing armies", "innocent people get killed", "national borders change", "death", "people become refugees", "people become homeless", "negotiate peace", "may get into struggle"], "input": {"head": "wage war", "relation": "HasSubEvent"}}
|
||||
{"generation": "overcome obstacles and develop their skills", "references": ["PersonX is a wimp", "PersonX got sick from Scott", "PersonX has been knocked out by his girlfriend", "PersonX does not have the courage to confront the situation.", "PersonX does not have the right equipment to defeat the enemy.", "PersonX does not have a gym membership", "PersonX has no skills", "PersonX is afraid", "PersonX was made too weak by his mom's diet"], "input": {"head": "PersonX goes from zero to hero", "relation": "HinderedBy"}}
|
||||
{"generation": "relaxed", "references": ["content", "healthy", "relaxed", "alone", "restive", "blissful", "care-free"], "input": {"head": "PersonX is walking on the beach", "relation": "xAttr"}}
|
||||
{"generation": " receiving their items", "references": ["rushed to hospital", "none", "operated on"], "input": {"head": "PersonX cuts open ___", "relation": "oEffect"}}
|
||||
{"generation": "excited", "references": ["included"], "input": {"head": "PersonX becomes PersonY member", "relation": "xReact"}}
|
||||
{"generation": "fueling heaters", "references": ["create heat", "kill head lice", "soak a rag", "clean paint off a wall", "clean off grease", "burn someone's house who cheated on you", "remove a messbecom", "disinfect a wound", "become arsonist"], "input": {"head": "kerosene", "relation": "ObjectUse"}}
|
||||
{"generation": "happiness and companionship", "references": ["want to feed a dog", "none", "he want get a dog", "he interested a pets", "he jalouse for another"], "input": {"head": "PersonX loves PersonX's dog", "relation": "xEffect"}}
|
||||
{"generation": "they were not interrupted or uncomfortable", "references": ["PersonX is out of control and has become overcome by his sexual desires.", "PersonY is holding PersonX down.", "They are in love with Person Y", "PersonY will leave PersonX.", "They are feeling the passion of the moment.", "PersonY is crying out of control and PersonX is full of compassion."], "input": {"head": "PersonX stops kissing PersonY", "relation": "HinderedBy"}}
|
||||
{"generation": " experienced a strong emotion or had a significant interaction with PersonY", "references": ["PersonX is on a date with PersonY"], "input": {"head": "PersonX looks at PersonY like that", "relation": "isAfter"}}
|
||||
{"generation": "surprised or happy for them.", "references": ["glad", "overjoyed", "happy."], "input": {"head": "PersonX quickly fell in love", "relation": "oReact"}}
|
||||
{"generation": " achieve personal satisfaction and fulfillment", "references": ["to be productive.", "to be accomplished", "have a job they love", "none", "to succeed."], "input": {"head": "PersonX loves PersonX's work", "relation": "xIntent"}}
|
||||
{"generation": "share their experiences", "references": ["take person Y home", "to eat dinner", "to get PersonY familiar with the area", "to rest", "refill up with gas", "to answer PersonY's questions"], "input": {"head": "PersonX takes PersonY everywhere", "relation": "xWant"}}
|
||||
{"generation": "bond better", "references": ["to spend time with X", "to marry X", "to bond with x", "to reciprocate the feelings", "to avoid x", "to spend as much time with them as they can"], "input": {"head": "PersonX gets close to PersonY", "relation": "oWant"}}
|
||||
{"generation": "the ocean", "references": ["ocean", "potato sout", "all oceans of world", "most oceans", "surf", "sea at coral reefs", "salt water", "tidal waters", "shores washed up and dead", "thai restaurant", "monterey bay aquarium", "north sea", "warm ocean", "peanut butter pool", "coral reef", "marine aquarium", "thesand", "public aquarium", "chesapeake bay", "restaurant with strange menu", "mediterranean sea", "open ocean", "zoo", "pervet's bedroom", "tropical waters", "aqurium", "see", "jelly bean", "warm ocean water", "detroit zoo", "book", "ocea", "sandwith", "gulf", "sushi restaurant", "encyclopedia", "tidal pools", "current", "pond", "weirdest places", "cartoon", "penny candy aisle", "tropical body of water", "international waters", "lake", "atlantic ocean", "ocean or aquarium", "japanese restaurant", "chinese entree", "warm sea", "art", "baltimore aquarium", "maui", "jamaca", "movie", "sea water", "oriental restaurant", "bay", "smack", "oceans and seas", "deep ocean", "texas", "saltwater", "hand", "saardina", "underwater", "hawaii", "monterey bay", "florida", "tank", "oceanic trench", "chinese restaurant", "cuba", "jungle", "photographs", "warm ocean waters", "osean", "red sea", "store", "aquarium", "calm waters", "pacific ocean", "sea world", "bathing suit", "shore", "ocean water"], "input": {"head": "jellyfish", "relation": "AtLocation"}}
|
||||
{"generation": "feels a sense of accomplishment and connection with PersonY", "references": ["PersonX gets a drink for PersonY"], "input": {"head": "PersonX makes PersonY's laugh", "relation": "isBefore"}}
|
||||
{"generation": "bones", "references": ["mouse button", "mouse wheel"], "input": {"head": "mouse", "relation": "MadeUpOf"}}
|
||||
{"generation": "gallop at high speeds", "references": ["service mare", "cover mare"], "input": {"head": "stallion", "relation": "CapableOf"}}
|
||||
{"generation": "granular and composed of small particles", "references": ["found at beach", "found in desert", "dry and small", "found on beach", "played in", "black", "made into glass", "found on beaches", "white", "made up of tiny ground rocks", "gritty"], "input": {"head": "sand", "relation": "HasProperty"}}
|
||||
{"generation": "dead or diseased plants.", "references": ["plants to die"], "input": {"head": "gardener", "relation": "NotDesires"}}
|
||||
{"generation": "increased sweating and discomfort", "references": ["fainting"], "input": {"head": "hot weather", "relation": "Causes"}}
|
||||
{"generation": "they were feeling unwell or tired", "references": ["of cold", "were sick", "feel ill", "you're sick", "you're still tired", "don't feel well", "back hurts", "have sore throat"], "input": {"head": "stay in bed", "relation": "xReason"}}
|
||||
{"generation": "a safe place to lay eggs and find food", "references": ["bread crumbs"], "input": {"head": "hen", "relation": "Desires"}}
@@ -0,0 +1,3 @@
{"bleu1": 0.6444057248156898, "bleu2": 0.48373036211716114, "bleu3": 0.34595090028613795, "bleu4": 0.2680433075050665, "isFilledBy": 0.8571428571428571, "HasSubEvent": 1.0, "HinderedBy": 0.9184781335868804, "xAttr": 1.0, "oEffect": 0.5, "xReact": 0.4953587998572467, "ObjectUse": 1.0, "xEffect": 0.5925925925925926, "isAfter": 0.41333333333333333, "oReact": 0.42857142857142855, "xIntent": 0.5869565217391305, "xWant": 0.8260869565217391, "oWant": 0.9090909090909091, "AtLocation": 1.0, "isBefore": 0.423728813559322, "MadeUpOf": 0.301194211912202, "CapableOf": 0.3333333333333333, "HasProperty": 0.75, "NotDesires": 0.5, "Causes": 0.24242424242424243, "xReason": 0.6666666666666666, "Desires": 0.15789473684210523}
bleu1 bleu2 bleu3 bleu4 isFilledBy HasSubEvent HinderedBy xAttr oEffect xReact ObjectUse xEffect isAfter oReact xIntent xWant oWant AtLocation isBefore MadeUpOf CapableOf HasProperty NotDesires Causes xReason Desires
0.6444057248156898 0.48373036211716114 0.34595090028613795 0.2680433075050665 0.8571428571428571 1.0 0.9184781335868804 1.0 0.5 0.4953587998572467 1.0 0.5925925925925926 0.41333333333333333 0.42857142857142855 0.5869565217391305 0.8260869565217391 0.9090909090909091 1.0 0.423728813559322 0.301194211912202 0.3333333333333333 0.75 0.5 0.24242424242424243 0.6666666666666666 0.15789473684210523
23
KG Reasoning/ATOMIC2020/system_eval/BART/chatgpt.json
Normal file
@@ -0,0 +1,23 @@
{"head": "PersonX also saw ___", "relation": "isFilledBy", "tails": "birds", "generations": ["none"], "greedy": "none"}
{"head": "wage war", "relation": "HasSubEvent", "tails": "cities destroyed", "generations": ["need to strategize and plan your actions carefully"], "greedy": "need to strategize and plan your actions carefully"}
{"head": "PersonX goes from zero to hero", "relation": "xNeed", "tails": "assess a situation", "generations": ["need to strategize and plan your actions carefully"], "greedy": "need to strategize and plan your actions carefully"}
{"head": "PersonX is walking on the beach", "relation": "xAttr", "tails": "alone", "generations": ["enjoying the view", "feeling the sand between their toes", "listening to the sound of the waves", "taking photos of the scenery", "looking for seashells"], "greedy": "enjoying the view"}
{"head": "PersonX cuts open ___", "relation": "oEffect", "tails": "operated on", "generations": ["none"], "greedy": "none"}
{"head": "PersonX becomes PersonY member", "relation": "xReact", "tails": "included", "generations": ["happy", "excited", "required", "hesitant", "cautious"], "greedy": "happy"}
{"head": "kerosene", "relation": "ObjectUse", "tails": "become arsonist", "generations": ["fuel"], "greedy": "fuel"}
{"head": "PersonX loves PersonX's dog", "relation": "xEffect", "tails": "he want get a dog", "generations": ["happiness", "joy"], "greedy": "happiness"}
{"head": "PersonX stops kissing PersonY", "relation": "HinderedBy", "tails": "PersonY is crying out of control and PersonX is full of compassion.", "generations": ["they didn't feel uncomfortable or if they were not interrupted by an external factor such as a phone call or a sudden noise"], "greedy": "they didn't feel uncomfortable or if they were not interrupted by an external factor such as a phone call or a sudden noise"}
{"head": "PersonX looks at PersonY like that", "relation": "isAfter", "tails": "PersonX is on a date with PersonY", "generations": ["talked to them", "smiled at them", "hugged them", "kissed them"], "greedy": "talked to them"}
{"head": "PersonX quickly fell in love", "relation": "oReact", "tails": "happy.", "generations": ["happy for them"], "greedy": "happy for them"}
{"head": "PersonX loves PersonX's work", "relation": "xIntent", "tails": "none", "generations": [" show appreciation"], "greedy": " show appreciation"}
{"head": "PersonX takes PersonY everywhere", "relation": "xWant", "tails": "refill up with gas", "generations": ["spend more time with them"], "greedy": "spend more time with them"}
{"head": "PersonX gets close to PersonY", "relation": "oWant", "tails": "to bond with x", "generations": ["get closer to PersonX and PersonY"], "greedy": "get closer to PersonX and PersonY"}
{"head": "jellyfish", "relation": "AtLocation", "tails": "potato sout", "generations": ["the ocean", "the sea"], "greedy": "the ocean"}
{"head": "PersonX makes PersonY's laugh", "relation": "isBefore", "tails": "PersonX gets a drink for PersonY", "generations": ["leaves the room", "tells a joke", "receives a phone call", "apologizes for something", "continues the conversation"], "greedy": "leaves the room"}
{"head": "mouse", "relation": "MadeUpOf", "tails": "mouse button", "generations": ["cells"], "greedy": "cells"}
{"head": "stallion", "relation": "CapableOf", "tails": "service mare", "generations": ["run fast"], "greedy": "run fast"}
{"head": "sand", "relation": "HasProperty", "tails": "white", "generations": ["gritty"], "greedy": "gritty"}
{"head": "gardener", "relation": "NotDesires", "tails": "plants to die", "generations": ["weeds", "pests", "to use harmful chemicals", "to work on weekends"], "greedy": "weeds"}
{"head": "hot weather", "relation": "Causes", "tails": "fainting", "generations": ["dehydration", "heat exhaustion", "air pollution"], "greedy": "dehydration"}
{"head": "stay in bed", "relation": "xReason", "tails": "of cold", "generations": ["they were feeling ill", "they wanted to get some rest", "it was a lazy Sunday morning"], "greedy": "they were feeling ill"}
{"head": "hen", "relation": "Desires", "tails": "bread crumbs", "generations": ["a comfortable nesting area", "fresh grains and vegetables", "to flee or hide"], "greedy": "a comfortable nesting area"}
23
KG Reasoning/ATOMIC2020/system_eval/BART/chatgpt_gens.jsonl
Normal file
@@ -0,0 +1,23 @@
{"generation": "none", "references": ["the dolphin", "the whale", "monuments", "boss", "trees", "coworker", "the car", "flowers", "birds"], "input": {"head": "PersonX also saw ___", "relation": "isFilledBy"}}
{"generation": "need to strategize and plan your actions carefully", "references": ["attack enemy", "people get killed", "reconsider all options", "cities destroyed", "fight", "people become underfed", "people protest", "write proclamation declaring war", "assination", "people become injured or killed", "become angry", "human lives lost", "victory", "prepare for battle", "people in armies would fight", "stop fighting", "bombs droped", "arms makers grow more wealthy", "killing enemy soldiers", "declare war", "win or lose", "attrition", "get bad", "defeat", "engage deception", "attack opposing armies", "innocent people get killed", "national borders change", "death", "people become refugees", "people become homeless", "negotiate peace", "may get into struggle"], "input": {"head": "wage war", "relation": "HasSubEvent"}}
{"generation": "need to strategize and plan your actions carefully", "references": ["PersonX is a wimp", "PersonX got sick from Scott", "PersonX has been knocked out by his girlfriend", "PersonX does not have the courage to confront the situation.", "PersonX does not have the right equipment to defeat the enemy.", "PersonX does not have a gym membership", "PersonX has no skills", "PersonX is afraid", "PersonX was made too weak by his mom's diet"], "input": {"head": "PersonX goes from zero to hero", "relation": "HinderedBy"}}
{"generation": "enjoying the view", "references": ["content", "healthy", "relaxed", "alone", "restive", "blissful", "care-free"], "input": {"head": "PersonX is walking on the beach", "relation": "xAttr"}}
{"generation": "none", "references": ["rushed to hospital", "none", "operated on"], "input": {"head": "PersonX cuts open ___", "relation": "oEffect"}}
{"generation": "happy", "references": ["included"], "input": {"head": "PersonX becomes PersonY member", "relation": "xReact"}}
{"generation": "fuel", "references": ["create heat", "kill head lice", "soak a rag", "clean paint off a wall", "clean off grease", "burn someone's house who cheated on you", "remove a messbecom", "disinfect a wound", "become arsonist"], "input": {"head": "kerosene", "relation": "ObjectUse"}}
{"generation": "happiness", "references": ["want to feed a dog", "none", "he want get a dog", "he interested a pets", "he jalouse for another"], "input": {"head": "PersonX loves PersonX's dog", "relation": "xEffect"}}
{"generation": "they didn't feel uncomfortable or if they were not interrupted by an external factor such as a phone call or a sudden noise", "references": ["PersonX is out of control and has become overcome by his sexual desires.", "PersonY is holding PersonX down.", "They are in love with Person Y", "PersonY will leave PersonX.", "They are feeling the passion of the moment.", "PersonY is crying out of control and PersonX is full of compassion."], "input": {"head": "PersonX stops kissing PersonY", "relation": "HinderedBy"}}
{"generation": "talked to them", "references": ["PersonX is on a date with PersonY"], "input": {"head": "PersonX looks at PersonY like that", "relation": "isAfter"}}
{"generation": "happy for them", "references": ["glad", "overjoyed", "happy."], "input": {"head": "PersonX quickly fell in love", "relation": "oReact"}}
{"generation": " show appreciation", "references": ["to be productive.", "to be accomplished", "have a job they love", "none", "to succeed."], "input": {"head": "PersonX loves PersonX's work", "relation": "xIntent"}}
{"generation": "spend more time with them", "references": ["take person Y home", "to eat dinner", "to get PersonY familiar with the area", "to rest", "refill up with gas", "to answer PersonY's questions"], "input": {"head": "PersonX takes PersonY everywhere", "relation": "xWant"}}
{"generation": "get closer to PersonX and PersonY", "references": ["to spend time with X", "to marry X", "to bond with x", "to reciprocate the feelings", "to avoid x", "to spend as much time with them as they can"], "input": {"head": "PersonX gets close to PersonY", "relation": "oWant"}}
{"generation": "the ocean", "references": ["ocean", "potato sout", "all oceans of world", "most oceans", "surf", "sea at coral reefs", "salt water", "tidal waters", "shores washed up and dead", "thai restaurant", "monterey bay aquarium", "north sea", "warm ocean", "peanut butter pool", "coral reef", "marine aquarium", "thesand", "public aquarium", "chesapeake bay", "restaurant with strange menu", "mediterranean sea", "open ocean", "zoo", "pervet's bedroom", "tropical waters", "aqurium", "see", "jelly bean", "warm ocean water", "detroit zoo", "book", "ocea", "sandwith", "gulf", "sushi restaurant", "encyclopedia", "tidal pools", "current", "pond", "weirdest places", "cartoon", "penny candy aisle", "tropical body of water", "international waters", "lake", "atlantic ocean", "ocean or aquarium", "japanese restaurant", "chinese entree", "warm sea", "art", "baltimore aquarium", "maui", "jamaca", "movie", "sea water", "oriental restaurant", "bay", "smack", "oceans and seas", "deep ocean", "texas", "saltwater", "hand", "saardina", "underwater", "hawaii", "monterey bay", "florida", "tank", "oceanic trench", "chinese restaurant", "cuba", "jungle", "photographs", "warm ocean waters", "osean", "red sea", "store", "aquarium", "calm waters", "pacific ocean", "sea world", "bathing suit", "shore", "ocean water"], "input": {"head": "jellyfish", "relation": "AtLocation"}}
{"generation": "leaves the room", "references": ["PersonX gets a drink for PersonY"], "input": {"head": "PersonX makes PersonY's laugh", "relation": "isBefore"}}
{"generation": "cells", "references": ["mouse button", "mouse wheel"], "input": {"head": "mouse", "relation": "MadeUpOf"}}
{"generation": "run fast", "references": ["service mare", "cover mare"], "input": {"head": "stallion", "relation": "CapableOf"}}
{"generation": "gritty", "references": ["found at beach", "found in desert", "dry and small", "found on beach", "played in", "black", "made into glass", "found on beaches", "white", "made up of tiny ground rocks", "gritty"], "input": {"head": "sand", "relation": "HasProperty"}}
{"generation": "weeds", "references": ["plants to die"], "input": {"head": "gardener", "relation": "NotDesires"}}
{"generation": "dehydration", "references": ["fainting"], "input": {"head": "hot weather", "relation": "Causes"}}
{"generation": "they were feeling ill", "references": ["of cold", "were sick", "feel ill", "you're sick", "you're still tired", "don't feel well", "back hurts", "have sore throat"], "input": {"head": "stay in bed", "relation": "xReason"}}
{"generation": "a comfortable nesting area", "references": ["bread crumbs"], "input": {"head": "hen", "relation": "Desires"}}
@@ -0,0 +1,3 @@
{"bleu1": 0.5569510572746097, "bleu2": 0.3954894422517138, "bleu3": 0.24265701174375645, "bleu4": 0.1838364317541617, "HasSubEvent": 0.82, "HinderedBy": 0.7548780487804878, "xAttr": 0.6470588235294118, "xReact": 0.0, "ObjectUse": 0.22313016014842982, "xEffect": 0.8888888888888888, "isAfter": 0.18385367289469146, "oReact": 0.5714285714285714, "xIntent": 0.8333333333333334, "xWant": 0.7839722858489145, "oWant": 0.7878787878787878, "AtLocation": 1.0, "isBefore": 0.23610273246096233, "MadeUpOf": 0.1807165271473212, "CapableOf": 0.38940039153570244, "HasProperty": 1.0, "NotDesires": 0.12113791079679323, "Causes": 0.36363636363636365, "xReason": 0.8095238095238095, "Desires": 0.3461538461538461}
bleu1 bleu2 bleu3 bleu4 HasSubEvent HinderedBy xAttr xReact ObjectUse xEffect isAfter oReact xIntent xWant oWant AtLocation isBefore MadeUpOf CapableOf HasProperty NotDesires Causes xReason Desires
0.5569510572746097 0.3954894422517138 0.24265701174375645 0.1838364317541617 0.82 0.7548780487804878 0.6470588235294118 0.0 0.22313016014842982 0.8888888888888888 0.18385367289469146 0.5714285714285714 0.8333333333333334 0.7839722858489145 0.7878787878787878 1.0 0.23610273246096233 0.1807165271473212 0.38940039153570244 1.0 0.12113791079679323 0.36363636363636365 0.8095238095238095 0.3461538461538461
23
KG Reasoning/ATOMIC2020/system_eval/BART/example.tsv
Normal file
@@ -0,0 +1,23 @@
PersonX also saw ___ isFilledBy birds none
wage war HasSubEvent cities destroyed none
PersonX goes from zero to hero xNeed assess a situation put in hard work and dedication to achieve their goals.
PersonX is walking on the beach xAttr alone enjoying the scenery and the peaceful atmosphere.
PersonX cuts open ___ oEffect operated on unclear
PersonX becomes PersonY member xReact included happy and possibly eager to participate in activities or events associated with the membership.
kerosene ObjectUse become arsonist fueling lamps, stoves, and heaters.
PersonX loves PersonX's dog xEffect he want get a dog feeling happy and fulfilled,having a stronger bond with their pet.
PersonX stops kissing PersonY HinderedBy PersonY is crying out of control and PersonX is full of compassion. they both wanted to continue
PersonX looks at PersonY like that isAfter PersonX is on a date with PersonY unknown
PersonX quickly fell in love oReact happy. happy
PersonX loves PersonX's work xIntent none to feel fulfilled and accomplished.
PersonX takes PersonY everywhere xWant refill up with gas spend some quality time with PersonY.
PersonX gets close to PersonY oWant to bond with x get to know PersonY better.
jellyfish AtLocation potato sout the ocean, sea.
PersonX makes PersonY's laugh isBefore PersonX gets a drink for PersonY feels happy ,satisfied.
mouse MadeUpOf mouse button cells ,organs.
stallion CapableOf service mare mate with multiple mares,run very fast
sand HasProperty white gritty , granular.
gardener NotDesires plants to die to see their plants die or wither.
hot weather Causes fainting dehydration , sweating.
stay in bed xReason of cold they were feeling unwell or tired.
hen Desires bread crumbs to lay eggs ,to find a safe place to roost.
13
KG Reasoning/ATOMIC2020/system_eval/BART/gen_json.py
Normal file
@@ -0,0 +1,13 @@
import json

# example.tsv columns (tab-separated): head \t relation \t gold tail \t comma-separated generations
with open("example.tsv") as f, open("1_CHATGPT.json", "w") as fw:
    for line in f:
        fields = line.rstrip("\n").split("\t")
        print(fields)
        js = {}
        js["head"] = fields[0]
        js["relation"] = fields[1]
        js["tails"] = fields[2]
        js["generations"] = [g.strip() for g in fields[3].split(",")]
        # treat the first generation as the greedy decode
        js["greedy"] = js["generations"][0]
        fw.write(json.dumps(js) + "\n")
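gen_json.py above converts one tab-separated row of example.tsv into the JSON-lines record consumed by the evaluator. A minimal stdlib sketch of that per-row conversion, assuming the four columns are head, relation, gold tail, and comma-separated generations (`parse_row` is an illustrative helper, not part of the repo):

```python
import json

def parse_row(row):
    # row: head \t relation \t gold tail \t comma-separated generations
    head, relation, tail, gens = row.rstrip("\n").split("\t")
    generations = [g.strip() for g in gens.split(",")]
    return {
        "head": head,
        "relation": relation,
        "tails": tail,
        "generations": generations,
        # first generation stands in for the greedy decode
        "greedy": generations[0],
    }

row = "hot weather\tCauses\tfainting\tdehydration , sweating."
record = parse_row(row)
print(json.dumps(record))
```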
23
KG Reasoning/ATOMIC2020/system_eval/BART/text003.json
Normal file
@@ -0,0 +1,23 @@
{"head": "PersonX also saw ___", "relation": "isFilledBy", "tails": "birds", "generations": ["none"], "greedy": "none"}
{"head": "wage war", "relation": "HasSubEvent", "tails": "cities destroyed", "generations": ["none"], "greedy": "none"}
{"head": "PersonX goes from zero to hero", "relation": "xNeed", "tails": "assess a situation", "generations": ["develop a plan and take action."], "greedy": "develop a plan and take action."}
{"head": "PersonX is walking on the beach", "relation": "xAttr", "tails": "alone", "generations": ["enjoying the sunshine"], "greedy": "enjoying the sunshine"}
{"head": "PersonX cuts open ___", "relation": "oEffect", "tails": "operated on", "generations": [" a feeling of surprise"], "greedy": " a feeling of surprise"}
{"head": "PersonX becomes PersonY member", "relation": "xReact", "tails": "included", "generations": ["eligible to receive benefits from xReact"], "greedy": "eligible to receive benefits from xReact"}
{"head": "kerosene", "relation": "ObjectUse", "tails": "become arsonist", "generations": ["Heating"], "greedy": "Heating"}
{"head": "PersonX loves PersonX's dog", "relation": "xEffect", "tails": "he want get a dog", "generations": ["an increased feeling of love and connection"], "greedy": "an increased feeling of love and connection"}
{"head": "PersonX stops kissing PersonY", "relation": "HinderedBy", "tails": "PersonY is crying out of control and PersonX is full of compassion.", "generations": ["PersonY was not uncomfortable or did not want the kiss to end"], "greedy": "PersonY was not uncomfortable or did not want the kiss to end"}
{"head": "PersonX looks at PersonY like that", "relation": "isAfter", "tails": "PersonX is on a date with PersonY", "generations": [" had been smiling at PersonY"], "greedy": " had been smiling at PersonY"}
{"head": "PersonX quickly fell in love", "relation": "oReact", "tails": "happy.", "generations": ["envious"], "greedy": "envious"}
{"head": "PersonX loves PersonX's work", "relation": "xIntent", "tails": "none", "generations": [" express their appreciation"], "greedy": " express their appreciation"}
{"head": "PersonX takes PersonY everywhere", "relation": "xWant", "tails": "refill up with gas", "generations": ["go home"], "greedy": "go home"}
{"head": "PersonX gets close to PersonY", "relation": "oWant", "tails": "to bond with x", "generations": ["do the same"], "greedy": "do the same"}
{"head": "jellyfish", "relation": "AtLocation", "tails": "potato sout", "generations": ["the ocean"], "greedy": "the ocean"}
{"head": "PersonX makes PersonY's laugh", "relation": "isBefore", "tails": "PersonX gets a drink for PersonY", "generations": ["continues to make PersonY laugh"], "greedy": "continues to make PersonY laugh"}
{"head": "mouse", "relation": "MadeUpOf", "tails": "mouse button", "generations": ["bones", "organs", "muscles", "and other tissues"], "greedy": "bones"}
{"head": "stallion", "relation": "CapableOf", "tails": "service mare", "generations": ["run"], "greedy": "run"}
{"head": "sand", "relation": "HasProperty", "tails": "white", "generations": ["Grainy"], "greedy": "Grainy"}
{"head": "gardener", "relation": "NotDesires", "tails": "plants to die", "generations": ["to work"], "greedy": "to work"}
{"head": "hot weather", "relation": "Causes", "tails": "fainting", "generations": ["Sweating"], "greedy": "Sweating"}
{"head": "stay in bed", "relation": "xReason", "tails": "of cold", "generations": ["they were feeling ill"], "greedy": "they were feeling ill"}
{"head": "hen", "relation": "Desires", "tails": "bread crumbs", "generations": ["eggs"], "greedy": "eggs"}
23
KG Reasoning/ATOMIC2020/system_eval/BART/text003_gens.jsonl
Normal file
@@ -0,0 +1,23 @@
{"generation": "none", "references": ["the dolphin", "the whale", "monuments", "boss", "trees", "coworker", "the car", "flowers", "birds"], "input": {"head": "PersonX also saw ___", "relation": "isFilledBy"}}
{"generation": "none", "references": ["attack enemy", "people get killed", "reconsider all options", "cities destroyed", "fight", "people become underfed", "people protest", "write proclamation declaring war", "assination", "people become injured or killed", "become angry", "human lives lost", "victory", "prepare for battle", "people in armies would fight", "stop fighting", "bombs droped", "arms makers grow more wealthy", "killing enemy soldiers", "declare war", "win or lose", "attrition", "get bad", "defeat", "engage deception", "attack opposing armies", "innocent people get killed", "national borders change", "death", "people become refugees", "people become homeless", "negotiate peace", "may get into struggle"], "input": {"head": "wage war", "relation": "HasSubEvent"}}
{"generation": "develop a plan and take action.", "references": ["PersonX is a wimp", "PersonX got sick from Scott", "PersonX has been knocked out by his girlfriend", "PersonX does not have the courage to confront the situation.", "PersonX does not have the right equipment to defeat the enemy.", "PersonX does not have a gym membership", "PersonX has no skills", "PersonX is afraid", "PersonX was made too weak by his mom's diet"], "input": {"head": "PersonX goes from zero to hero", "relation": "HinderedBy"}}
{"generation": "enjoying the sunshine", "references": ["content", "healthy", "relaxed", "alone", "restive", "blissful", "care-free"], "input": {"head": "PersonX is walking on the beach", "relation": "xAttr"}}
{"generation": " a feeling of surprise", "references": ["rushed to hospital", "none", "operated on"], "input": {"head": "PersonX cuts open ___", "relation": "oEffect"}}
{"generation": "eligible to receive benefits from xReact", "references": ["included"], "input": {"head": "PersonX becomes PersonY member", "relation": "xReact"}}
{"generation": "Heating", "references": ["create heat", "kill head lice", "soak a rag", "clean paint off a wall", "clean off grease", "burn someone's house who cheated on you", "remove a messbecom", "disinfect a wound", "become arsonist"], "input": {"head": "kerosene", "relation": "ObjectUse"}}
{"generation": "an increased feeling of love and connection", "references": ["want to feed a dog", "none", "he want get a dog", "he interested a pets", "he jalouse for another"], "input": {"head": "PersonX loves PersonX's dog", "relation": "xEffect"}}
{"generation": "PersonY was not uncomfortable or did not want the kiss to end", "references": ["PersonX is out of control and has become overcome by his sexual desires.", "PersonY is holding PersonX down.", "They are in love with Person Y", "PersonY will leave PersonX.", "They are feeling the passion of the moment.", "PersonY is crying out of control and PersonX is full of compassion."], "input": {"head": "PersonX stops kissing PersonY", "relation": "HinderedBy"}}
{"generation": " had been smiling at PersonY", "references": ["PersonX is on a date with PersonY"], "input": {"head": "PersonX looks at PersonY like that", "relation": "isAfter"}}
{"generation": "envious", "references": ["glad", "overjoyed", "happy."], "input": {"head": "PersonX quickly fell in love", "relation": "oReact"}}
{"generation": " express their appreciation", "references": ["to be productive.", "to be accomplished", "have a job they love", "none", "to succeed."], "input": {"head": "PersonX loves PersonX's work", "relation": "xIntent"}}
{"generation": "go home", "references": ["take person Y home", "to eat dinner", "to get PersonY familiar with the area", "to rest", "refill up with gas", "to answer PersonY's questions"], "input": {"head": "PersonX takes PersonY everywhere", "relation": "xWant"}}
{"generation": "do the same", "references": ["to spend time with X", "to marry X", "to bond with x", "to reciprocate the feelings", "to avoid x", "to spend as much time with them as they can"], "input": {"head": "PersonX gets close to PersonY", "relation": "oWant"}}
{"generation": "the ocean", "references": ["ocean", "potato sout", "all oceans of world", "most oceans", "surf", "sea at coral reefs", "salt water", "tidal waters", "shores washed up and dead", "thai restaurant", "monterey bay aquarium", "north sea", "warm ocean", "peanut butter pool", "coral reef", "marine aquarium", "thesand", "public aquarium", "chesapeake bay", "restaurant with strange menu", "mediterranean sea", "open ocean", "zoo", "pervet's bedroom", "tropical waters", "aqurium", "see", "jelly bean", "warm ocean water", "detroit zoo", "book", "ocea", "sandwith", "gulf", "sushi restaurant", "encyclopedia", "tidal pools", "current", "pond", "weirdest places", "cartoon", "penny candy aisle", "tropical body of water", "international waters", "lake", "atlantic ocean", "ocean or aquarium", "japanese restaurant", "chinese entree", "warm sea", "art", "baltimore aquarium", "maui", "jamaca", "movie", "sea water", "oriental restaurant", "bay", "smack", "oceans and seas", "deep ocean", "texas", "saltwater", "hand", "saardina", "underwater", "hawaii", "monterey bay", "florida", "tank", "oceanic trench", "chinese restaurant", "cuba", "jungle", "photographs", "warm ocean waters", "osean", "red sea", "store", "aquarium", "calm waters", "pacific ocean", "sea world", "bathing suit", "shore", "ocean water"], "input": {"head": "jellyfish", "relation": "AtLocation"}}
{"generation": "continues to make PersonY laugh", "references": ["PersonX gets a drink for PersonY"], "input": {"head": "PersonX makes PersonY's laugh", "relation": "isBefore"}}
{"generation": "bones", "references": ["mouse button", "mouse wheel"], "input": {"head": "mouse", "relation": "MadeUpOf"}}
{"generation": "run", "references": ["service mare", "cover mare"], "input": {"head": "stallion", "relation": "CapableOf"}}
{"generation": "Grainy", "references": ["found at beach", "found in desert", "dry and small", "found on beach", "played in", "black", "made into glass", "found on beaches", "white", "made up of tiny ground rocks", "gritty"], "input": {"head": "sand", "relation": "HasProperty"}}
{"generation": "to work", "references": ["plants to die"], "input": {"head": "gardener", "relation": "NotDesires"}}
{"generation": "Sweating", "references": ["fainting"], "input": {"head": "hot weather", "relation": "Causes"}}
{"generation": "they were feeling ill", "references": ["of cold", "were sick", "feel ill", "you're sick", "you're still tired", "don't feel well", "back hurts", "have sore throat"], "input": {"head": "stay in bed", "relation": "xReason"}}
{"generation": "eggs", "references": ["bread crumbs"], "input": {"head": "hen", "relation": "Desires"}}
@@ -0,0 +1,3 @@
{"bleu1": 0.6033196140167023, "bleu2": 0.42710810277634487, "bleu3": 0.3028772328842837, "bleu4": 0.20672574828575693, "HinderedBy": 0.8602006786524257, "xAttr": 0.6666666666666666, "oEffect": 0.6363636363636364, "xReact": 0.12500000000000003, "ObjectUse": 0.558376335026619, "xEffect": 0.5813953488372093, "isAfter": 0.7169694062511285, "oReact": 0.42857142857142855, "xIntent": 0.6666666666666666, "xWant": 1.0, "oWant": 1.0, "AtLocation": 1.0, "isBefore": 0.7183839862680628, "MadeUpOf": 0.301194211912202, "CapableOf": 0.03232398928813501, "HasProperty": 0.8333333333333334, "NotDesires": 0.18187407671869282, "Causes": 0.625, "xReason": 0.8095238095238095, "Desires": 0.06766764161830635}
bleu1 bleu2 bleu3 bleu4 HinderedBy xAttr oEffect xReact ObjectUse xEffect isAfter oReact xIntent xWant oWant AtLocation isBefore MadeUpOf CapableOf HasProperty NotDesires Causes xReason Desires
0.6033196140167023 0.42710810277634487 0.3028772328842837 0.20672574828575693 0.8602006786524257 0.6666666666666666 0.6363636363636364 0.12500000000000003 0.558376335026619 0.5813953488372093 0.7169694062511285 0.42857142857142855 0.6666666666666666 1.0 1.0 1.0 0.7183839862680628 0.301194211912202 0.03232398928813501 0.8333333333333334 0.18187407671869282 0.625 0.8095238095238095 0.06766764161830635
235
KG Reasoning/ATOMIC2020/system_eval/automatic_eval.py
Normal file
@@ -0,0 +1,235 @@
import argparse
import numpy as np
from nltk.translate.bleu_score import sentence_bleu
from utils import read_jsonl, remove_prefix, write_jsonl
from evaluation.eval import QGEvalCap
from tabulate import tabulate
import json
import os
from collections import defaultdict
import random


def get_reference_sentences(filename):
    result = []
    with open(filename) as file:
        for line in file:
            result.append([x.strip() for x in line.split('\t')[1].split('|')])
    return result


def postprocess(sentence):
    return sentence


def get_heads_and_relations(filename):
    result = []
    with open(filename) as file:
        for line in file:
            line = line.split('\t')[0]
            head_event = line.split('@@')[0].strip()
            relation = line.split('@@')[1].strip()
            to_add = {
                'head': head_event,
                'relation': relation
            }
            result.append(to_add)
    return result


def get_hypothesises(filename):
    result = []
    with open(filename) as file:
        for line in file:
            result.append(json.loads(line)["greedy"])
    return result


def preprocess_generations(args):
    input_file = args.input_file

    outfile_path = os.path.join(os.path.dirname(input_file), os.path.basename(input_file).split('.')[0] + "_gens.jsonl")

    outfile = open(outfile_path, 'w')

    references_list = get_reference_sentences('t.tsv')
    heads_relations = get_heads_and_relations('t.tsv')
    hypothesises = get_hypothesises(args.input_file)

    idx = 0

    total_bleu_1 = 0
    total_bleu_2 = 0
    total_bleu_3 = 0
    total_bleu_4 = 0

    relation_bleu_1 = defaultdict(lambda: defaultdict(int))

    count = 0

    for head_relation, references, hypothesis in zip(heads_relations, references_list, hypothesises):
        bleu_1 = sentence_bleu(references, hypothesis, weights=[1.0])
        bleu_2 = sentence_bleu(references, hypothesis, weights=[0.5, 0.5])
        bleu_3 = sentence_bleu(references, hypothesis, weights=[0.34, 0.33, 0.33])
        bleu_4 = sentence_bleu(references, hypothesis)

        result = {
            'generation': postprocess(hypothesis),
            'references': [postprocess(reference) for reference in references],
            'input': head_relation
        }
        if hypothesis != 'none':
            total_bleu_1 += bleu_1
            total_bleu_2 += bleu_2
            total_bleu_3 += bleu_3
            total_bleu_4 += bleu_4

            relation_bleu_1[head_relation["relation"]]["total"] += bleu_1
            relation_bleu_1[head_relation["relation"]]["count"] += 1

            count += 1

        outfile.write(json.dumps(result) + "\n")
    print('gens non-none', count)
    outfile_scores = open(os.path.join(os.path.dirname(input_file), os.path.basename(input_file).split('.')[0] + "_scores.jsonl"), 'w')

    summary = {
        'bleu1': total_bleu_1 / count,
        'bleu2': total_bleu_2 / count,
        'bleu3': total_bleu_3 / count,
        'bleu4': total_bleu_4 / count
    }

    for relation in relation_bleu_1:
        summary[relation] = relation_bleu_1[relation]["total"] / relation_bleu_1[relation]["count"]

    outfile_scores.write(json.dumps(summary) + "\n")
    excel_str = ""
    for key in summary:
        excel_str += str(key) + '\t'
    outfile_scores.write(excel_str.strip())
    outfile_scores.write("\n")
    excel_str = ""
    for key in summary:
        excel_str += str(summary[key]) + '\t'

    outfile_scores.write(excel_str.strip())

    print(f"Saved gens in {outfile_path}")

    return os.path.abspath(outfile_path)


def get_tuple(l):
    gens = [l["generation"]]
    head = l["input"]["head"]
    tails = l["references"]
    relation = l["input"]["relation"]
    return {"head": head, "relation": relation, "tails": tails, "generations": gens}


def get2(l):
    return list(zip(*l))[1]


def topk_eval(model_name, data, k):

    topk_gts = {}
    topk_res = {}
    instances = []
    topk_exact_match = []
|
||||
topk_exact_match_not_none = []
|
||||
topk_bleu_score = []
|
||||
|
||||
topk_is_head = []
|
||||
|
||||
for i, l in enumerate(data):
|
||||
t = get_tuple(l)
|
||||
gens = t["generations"]
|
||||
tails = t["tails"]
|
||||
head = t["head"]
|
||||
|
||||
for (j, g) in enumerate(gens[:k]):
|
||||
|
||||
instance = t.copy()
|
||||
instance["generation"] = g
|
||||
instances.append(instance)
|
||||
|
||||
key = str(i) + "_" + str(j)
|
||||
topk_gts[key] = tails
|
||||
topk_res[key] = [g]
|
||||
|
||||
if g in tails:
|
||||
topk_exact_match.append((l, 1))
|
||||
if g != "none":
|
||||
topk_exact_match_not_none.append((l, 1))
|
||||
else:
|
||||
topk_exact_match.append((l, 0))
|
||||
if g != "none":
|
||||
topk_exact_match_not_none.append((l, 0))
|
||||
if g == head:
|
||||
topk_is_head.append((l, 1))
|
||||
else:
|
||||
topk_is_head.append((l, 0))
|
||||
|
||||
QGEval = QGEvalCap(model_name, topk_gts, topk_res)
|
||||
score, scores = QGEval.evaluate()
|
||||
|
||||
return score, scores, instances
|
||||
|
||||
|
||||
def eval(data_file, model_name):
|
||||
|
||||
data = read_jsonl(data_file)
|
||||
|
||||
if len(data) == 0:
|
||||
return None
|
||||
|
||||
return topk_eval(model_name, data, k=1)
|
||||
|
||||
def toRow(name, results, columns):
|
||||
return [name] + [format(float(results[c]), '#.3f') for c in columns]
|
||||
|
||||
def main():
|
||||
parser = argparse.ArgumentParser()
|
||||
parser.add_argument('--input_file', type=str, help='Results file on ATOMIC2020 test set')
|
||||
|
||||
args = parser.parse_args()
|
||||
|
||||
generations_file = preprocess_generations(args)
|
||||
|
||||
input_file = generations_file
|
||||
|
||||
expts = [
|
||||
[input_file, os.path.basename(input_file).split('.')[0]]
|
||||
]
|
||||
|
||||
scores_per_model = []
|
||||
add_column = True
|
||||
for f, m in expts:
|
||||
result_file = './results/{}_scores.jsonl'.format(m)
|
||||
|
||||
s, scores, instances = eval(f, model_name=m)
|
||||
if s == None:
|
||||
print("Skipping ", m)
|
||||
continue
|
||||
|
||||
|
||||
for k in scores.keys():
|
||||
assert len(scores[k]) == len(instances)
|
||||
|
||||
results = {"model": m, "scores": s, "all_scores": scores, "instances": instances}
|
||||
write_jsonl(result_file, [results])
|
||||
|
||||
scores_per_model.append(results)
|
||||
columns = list(results["scores"].keys())
|
||||
s_row = toRow(results["model"], results["scores"], columns)
|
||||
if add_column:
|
||||
rows = [[""] + columns]
|
||||
add_column = False
|
||||
rows.append(s_row)
|
||||
|
||||
import datetime
|
||||
date = datetime.datetime.now().strftime("%Y%m%d%H%M%S")
|
||||
print(scores_per_model)
|
||||
|
||||
write_jsonl('./results/scores_{}.jsonl'.format(date), scores_per_model)
|
||||
print(tabulate(rows, headers='firstrow', tablefmt='latex', floatfmt='#.3f'))
|
||||
print(tabulate(rows, tablefmt='tsv', floatfmt='#.3f'))
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
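The script above averages NLTK `sentence_bleu` scores over generations. As a minimal, self-contained sketch of the core quantity behind BLEU-1 (the function name and toy sentences here are illustrative, not from the repo):

```python
from collections import Counter

def unigram_precision(hypothesis, references):
    """Clipped unigram precision, the core of BLEU-1.

    Each hypothesis token counts as correct at most as many times as it
    appears in the best-matching reference (the "clipping" step).
    """
    hyp_counts = Counter(hypothesis)
    max_ref_counts = Counter()
    for ref in references:
        for tok, c in Counter(ref).items():
            max_ref_counts[tok] = max(max_ref_counts[tok], c)
    correct = sum(min(c, max_ref_counts[tok]) for tok, c in hyp_counts.items())
    return correct / len(hypothesis)

refs = [["go", "to", "the", "store"], ["head", "to", "the", "store"]]
hyp = ["go", "to", "the", "shop"]
print(unigram_precision(hyp, refs))  # 0.75: 3 of 4 unigrams match a reference
```

NLTK's `sentence_bleu` additionally applies a brevity penalty and, for BLEU-2/3/4, a weighted geometric mean over higher-order n-gram precisions.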
21
KG Reasoning/ATOMIC2020/system_eval/evaluation/LICENSE
Normal file
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2018 Manish Joshi

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
2
KG Reasoning/ATOMIC2020/system_eval/evaluation/README.md
Normal file
@@ -0,0 +1,2 @@
# qgeval
Please use `system_eval/automatic_eval.py` for evaluation. This repo is taken from: https://github.com/allenai/alpha-nli/
@@ -0,0 +1,42 @@
from bert_score import score
# Code for BertScore reused from original implementation: https://github.com/Tiiiger/bert_score


class BertScore:
    def __init__(self):
        self._hypo_for_image = {}
        self.ref_for_image = {}

    def compute_score(self, gts, res):
        assert(gts.keys() == res.keys())
        imgIds = gts.keys()

        hyp_input = []
        ref_input = []
        same_indices = []
        for id in imgIds:
            hypo = res[id]
            ref = gts[id]

            # Sanity check.
            assert(type(hypo) is list)
            assert(len(hypo) == 1)
            assert(type(ref) is list)
            assert(len(ref) >= 1)

            hyp_input += [hypo[0]] * len(ref)
            ref_input += ref
            same_indices.append(len(ref_input))

        p, r, f_scores = score(hyp_input, ref_input, model_type="bert-base-uncased")

        prev_idx = 0
        aggreg_f1_scores = []
        for idx in same_indices:
            aggreg_f1_scores.append(f_scores[prev_idx: idx].mean().cpu().item())
            prev_idx = idx

        return sum(aggreg_f1_scores) / len(aggreg_f1_scores), aggreg_f1_scores

    def method(self):
        return "Bert Score"
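`BertScore.compute_score` repeats each hypothesis once per reference, scores all pairs in one flat batch, then averages the scores belonging to each example back together via `same_indices`. A minimal sketch of that pairing scheme, with a toy token-overlap score standing in for the batched BERTScore call (all names here are illustrative):

```python
def expand_and_group(gts, res, score_fn):
    """Repeat each hypothesis once per reference, score all pairs flat,
    then average the scores of each example's group (as compute_score does).

    score_fn maps a (hyp, ref) string pair to a float; it is a stand-in
    for the batched BERTScore call."""
    hyp_input, ref_input, same_indices = [], [], []
    for key in gts:
        hyp_input += [res[key][0]] * len(gts[key])
        ref_input += gts[key]
        same_indices.append(len(ref_input))
    flat = [score_fn(h, r) for h, r in zip(hyp_input, ref_input)]
    prev, grouped = 0, []
    for idx in same_indices:
        grouped.append(sum(flat[prev:idx]) / (idx - prev))
        prev = idx
    return grouped

# Toy score: token-overlap ratio (a hypothetical stand-in for BERTScore F1).
overlap = lambda h, r: len(set(h.split()) & set(r.split())) / max(len(set(r.split())), 1)
gts = {"a": ["x y", "x z"], "b": ["q"]}
res = {"a": ["x y"], "b": ["q"]}
print(expand_and_group(gts, res, overlap))  # [0.75, 1.0]
```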
@@ -0,0 +1,144 @@
import os
import time
import argparse
import torch
from collections import defaultdict
from pytorch_pretrained_bert import BertTokenizer, BertModel, BertForMaskedLM
import matplotlib
import matplotlib.pyplot as plt
import numpy as np

from .utils import get_idf_dict, bert_cos_score_idf,\
    get_bert_embedding, bert_types

__all__ = ['score', 'plot_example']

def score(cands, refs, bert="bert-base-multilingual-cased",
          num_layers=8, verbose=False, no_idf=False, batch_size=64):
    """
    BERTScore metric.
    Args:
        - :param: `cands` (list of str): candidate sentences
        - :param: `refs` (list of str): reference sentences
        - :param: `bert` (str): bert specification
        - :param: `num_layers` (int): the layer of representation to use
        - :param: `verbose` (bool): turn on intermediate status update
        - :param: `no_idf` (bool): do not use idf weighting
        - :param: `batch_size` (int): bert score processing batch size
    """
    assert len(cands) == len(refs)
    assert bert in bert_types

    tokenizer = BertTokenizer.from_pretrained(bert)
    model = BertModel.from_pretrained(bert)
    model.eval()
    device = 'cuda' if torch.cuda.is_available() else 'cpu'
    model.to(device)

    # drop unused layers
    model.encoder.layer = torch.nn.ModuleList([layer for layer in model.encoder.layer[:num_layers]])

    if no_idf:
        idf_dict = defaultdict(lambda: 1.)
        # set idf for [SEP] and [CLS] to 0
        idf_dict[101] = 0
        idf_dict[102] = 0
    else:
        if verbose:
            print('preparing IDF dict...')
            start = time.perf_counter()
        idf_dict = get_idf_dict(refs, tokenizer)
        if verbose:
            print('done in {:.2f} seconds'.format(time.perf_counter() - start))

    if verbose:
        print('calculating scores...')
        start = time.perf_counter()
    all_preds = bert_cos_score_idf(model, refs, cands, tokenizer, idf_dict,
                                   verbose=verbose, device=device, batch_size=batch_size)

    P = all_preds[:, 0].cpu()
    R = all_preds[:, 1].cpu()
    F1 = all_preds[:, 2].cpu()
    if verbose:
        print('done in {:.2f} seconds'.format(time.perf_counter() - start))

    return P, R, F1

def plot_example(h, r, verbose=False, bert="bert-base-multilingual-cased",
                 num_layers=8, fname=''):
    """
    BERTScore metric.
    Args:
        - :param: `h` (str): a candidate sentence
        - :param: `r` (str): a reference sentence
        - :param: `verbose` (bool): turn on intermediate status update
        - :param: `bert` (str): bert specification
        - :param: `num_layers` (int): the layer of representation to use
    """
    assert bert in bert_types

    if verbose:
        print('loading BERT model...')
    tokenizer = BertTokenizer.from_pretrained(bert)
    model = BertModel.from_pretrained(bert)
    model.eval()
    device = 'cuda' if torch.cuda.is_available() else 'cpu'
    model.to(device)

    h_tokens = ['[CLS]'] + tokenizer.tokenize(h) + ['[SEP]']
    r_tokens = ['[CLS]'] + tokenizer.tokenize(r) + ['[SEP]']

    model.encoder.layer = torch.nn.ModuleList([layer for layer in model.encoder.layer[:num_layers]])
    idf_dict = defaultdict(lambda: 1.)

    ref_embedding, ref_lens, ref_masks, padded_idf = get_bert_embedding([r], model, tokenizer, idf_dict,
                                                                       device=device)
    hyp_embedding, ref_lens, ref_masks, padded_idf = get_bert_embedding([h], model, tokenizer, idf_dict,
                                                                       device=device)

    ref_embedding.div_(torch.norm(ref_embedding, dim=-1).unsqueeze(-1))
    hyp_embedding.div_(torch.norm(hyp_embedding, dim=-1).unsqueeze(-1))

    batch_size = ref_embedding.size(1)

    sim = torch.bmm(hyp_embedding, ref_embedding.transpose(1, 2)).cpu()
    sim = sim.squeeze(0).numpy()

    # remove [CLS] and [SEP] tokens
    r_tokens = r_tokens[1:-1]
    h_tokens = h_tokens[1:-1]
    sim = sim[1:-1, 1:-1]

    fig, ax = plt.subplots(figsize=(len(r_tokens) * 0.8, len(h_tokens) * 0.8))
    im = ax.imshow(sim, cmap='Blues')

    # We want to show all ticks...
    ax.set_xticks(np.arange(len(r_tokens)))
    ax.set_yticks(np.arange(len(h_tokens)))
    # ... and label them with the respective list entries
    ax.set_xticklabels(r_tokens, fontsize=10)
    ax.set_yticklabels(h_tokens, fontsize=10)
    plt.xlabel("Reference", fontsize=10)
    plt.ylabel("Candidate", fontsize=10)

    # Rotate the tick labels and set their alignment.
    plt.setp(ax.get_xticklabels(), rotation=45, ha="right",
             rotation_mode="anchor")

    # Loop over data dimensions and create text annotations.
    for i in range(len(h_tokens)):
        for j in range(len(r_tokens)):
            text = ax.text(j, i, '{:.3f}'.format(sim[i, j]),
                           ha="center", va="center", color="k" if sim[i, j] < 0.6 else "w")

    # P = sim.max(1).mean()
    # R = sim.max(0).mean()
    # F1 = 2 * P * R / (P + R)

    fig.tight_layout()
    # plt.title("BERT-F1: {:.3f}".format(F1), fontsize=10)
    if fname != "":
        print("Saved figure to file: ", fname + ".png")
        plt.savefig(fname + '.png', dpi=100)
    plt.show()
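The commented-out lines in `plot_example` (`P = sim.max(1).mean()`, `R = sim.max(0).mean()`) describe BERTScore's greedy matching over the token similarity matrix. A tiny dependency-free sketch of that matching on a plain nested list (names are illustrative):

```python
def greedy_match(sim):
    """Greedy matching as in BERTScore: precision takes, for each candidate
    token (row), its best-matching reference token (max over columns);
    recall does the reverse over columns. `sim` is a candidate x reference
    similarity matrix."""
    precision = sum(max(row) for row in sim) / len(sim)
    ncols = len(sim[0])
    recall = sum(max(sim[i][j] for i in range(len(sim))) for j in range(ncols)) / ncols
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Two candidate tokens vs. two reference tokens.
sim = [[1.0, 0.2],
       [0.3, 0.8]]
p, r, f = greedy_match(sim)
print(p, r, f)  # 0.9 0.9 0.9
```

`greedy_cos_idf` in `utils.py` computes the same maxima batched in torch, with idf weights replacing the uniform averages used here.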
@@ -0,0 +1,212 @@
import torch
from math import log
from itertools import chain
from collections import defaultdict, Counter
from multiprocessing import Pool
from functools import partial
from tqdm.auto import tqdm

__all__ = ['bert_types']

bert_types = [
    'bert-base-uncased',
    'bert-large-uncased',
    'bert-base-cased',
    'bert-large-cased',
    'bert-base-multilingual-uncased',
    'bert-base-multilingual-cased',
    'bert-base-chinese',
]

def padding(arr, pad_token, dtype=torch.long):
    lens = torch.LongTensor([len(a) for a in arr])
    max_len = lens.max().item()
    padded = torch.ones(len(arr), max_len, dtype=dtype) * pad_token
    mask = torch.zeros(len(arr), max_len, dtype=torch.long)
    for i, a in enumerate(arr):
        padded[i, :lens[i]] = torch.tensor(a, dtype=dtype)
        mask[i, :lens[i]] = 1
    return padded, lens, mask


def bert_encode(model, x, attention_mask):
    model.eval()
    x_seg = torch.zeros_like(x, dtype=torch.long)
    with torch.no_grad():
        x_encoded_layers, pooled_output = model(x, x_seg, attention_mask=attention_mask, output_all_encoded_layers=False)
    return x_encoded_layers


def process(a, tokenizer=None):
    if tokenizer is not None:
        a = ["[CLS]"] + tokenizer.tokenize(a) + ["[SEP]"]
        a = tokenizer.convert_tokens_to_ids(a)
    return set(a)


def get_idf_dict(arr, tokenizer, nthreads=4):
    """
    Returns mapping from word piece index to its inverse document frequency.
    Args:
        - :param: `arr` (list of str) : sentences to process.
        - :param: `tokenizer` : a BERT tokenizer corresponds to `model`.
        - :param: `nthreads` (int) : number of CPU threads to use
    """
    idf_count = Counter()
    num_docs = len(arr)

    process_partial = partial(process, tokenizer=tokenizer)

    with Pool(nthreads) as p:
        idf_count.update(chain.from_iterable(p.map(process_partial, arr)))

    idf_dict = defaultdict(lambda: log((num_docs + 1) / (1)))
    idf_dict.update({idx: log((num_docs + 1) / (c + 1)) for (idx, c) in idf_count.items()})
    return idf_dict


def collate_idf(arr, tokenize, numericalize, idf_dict,
                pad="[PAD]", device='cuda:0'):
    """
    Helper function that pads a list of sentences to have the same length and
    loads idf score for words in the sentences.
    Args:
        - :param: `arr` (list of str): sentences to process.
        - :param: `tokenize` : a function that takes a string and return list
                  of tokens.
        - :param: `numericalize` : a function that takes a list of tokens and
                  return list of token indexes.
        - :param: `idf_dict` (dict): mapping a word piece index to its
                  inverse document frequency
        - :param: `pad` (str): the padding token.
        - :param: `device` (str): device to use, e.g. 'cpu' or 'cuda'
    """
    arr = [["[CLS]"] + tokenize(a) + ["[SEP]"] for a in arr]
    arr = [numericalize(a) for a in arr]

    idf_weights = [[idf_dict[i] for i in a] for a in arr]

    pad_token = numericalize([pad])[0]

    padded, lens, mask = padding(arr, pad_token, dtype=torch.long)
    padded_idf, _, _ = padding(idf_weights, pad_token, dtype=torch.float)

    padded = padded.to(device=device)
    mask = mask.to(device=device)
    lens = lens.to(device=device)
    return padded, padded_idf, lens, mask


def get_bert_embedding(all_sens, model, tokenizer, idf_dict,
                       batch_size=-1, device='cuda:0'):
    """
    Compute BERT embedding in batches.
    Args:
        - :param: `all_sens` (list of str) : sentences to encode.
        - :param: `model` : a BERT model from `pytorch_pretrained_bert`.
        - :param: `tokenizer` : a BERT tokenizer corresponds to `model`.
        - :param: `idf_dict` (dict) : mapping a word piece index to its
                  inverse document frequency
        - :param: `device` (str): device to use, e.g. 'cpu' or 'cuda'
    """
    padded_sens, padded_idf, lens, mask = collate_idf(all_sens,
                                                      tokenizer.tokenize, tokenizer.convert_tokens_to_ids,
                                                      idf_dict,
                                                      device=device)

    if batch_size == -1:
        batch_size = len(all_sens)

    embeddings = []
    with torch.no_grad():
        for i in range(0, len(all_sens), batch_size):
            batch_embedding = bert_encode(model, padded_sens[i:i + batch_size],
                                          attention_mask=mask[i:i + batch_size])
            # batch_embedding = torch.stack(batch_embedding)
            embeddings.append(batch_embedding)
            del batch_embedding

    total_embedding = torch.cat(embeddings, dim=0)

    return total_embedding, lens, mask, padded_idf


def greedy_cos_idf(ref_embedding, ref_lens, ref_masks, ref_idf,
                   hyp_embedding, hyp_lens, hyp_masks, hyp_idf):
    """
    Compute greedy matching based on cosine similarity.
    Args:
        - :param: `ref_embedding` (torch.Tensor):
                  embeddings of reference sentences, BxKxd,
                  B: batch size, K: longest length, d: bert dimension
        - :param: `ref_lens` (list of int): list of reference sentence length.
        - :param: `ref_masks` (torch.LongTensor): BxKxK, BERT attention mask for
                  reference sentences.
        - :param: `ref_idf` (torch.Tensor): BxK, idf score of each word
                  piece in the reference sentence
        - :param: `hyp_embedding` (torch.Tensor):
                  embeddings of candidate sentences, BxKxd,
                  B: batch size, K: longest length, d: bert dimension
        - :param: `hyp_lens` (list of int): list of candidate sentence length.
        - :param: `hyp_masks` (torch.LongTensor): BxKxK, BERT attention mask for
                  candidate sentences.
        - :param: `hyp_idf` (torch.Tensor): BxK, idf score of each word
                  piece in the candidate sentence
    """
    ref_embedding.div_(torch.norm(ref_embedding, dim=-1).unsqueeze(-1))
    hyp_embedding.div_(torch.norm(hyp_embedding, dim=-1).unsqueeze(-1))

    batch_size = ref_embedding.size(0)

    sim = torch.bmm(hyp_embedding, ref_embedding.transpose(1, 2))
    masks = torch.bmm(hyp_masks.unsqueeze(2).float(), ref_masks.unsqueeze(1).float())
    masks = masks.expand(batch_size, masks.size(1), masks.size(2))\
        .contiguous().view_as(sim)

    masks = masks.float().to(sim.device)
    sim = sim * masks

    word_precision = sim.max(dim=2)[0]
    word_recall = sim.max(dim=1)[0]

    hyp_idf.div_(hyp_idf.sum(dim=1, keepdim=True))
    ref_idf.div_(ref_idf.sum(dim=1, keepdim=True))
    precision_scale = hyp_idf.to(word_precision.device)
    recall_scale = ref_idf.to(word_recall.device)
    P = (word_precision * precision_scale).sum(dim=1)
    R = (word_recall * recall_scale).sum(dim=1)

    F = 2 * P * R / (P + R)
    return P, R, F


def bert_cos_score_idf(model, refs, hyps, tokenizer, idf_dict,
                       verbose=False, batch_size=64, device='cuda:0'):
    """
    Compute BERTScore.
    Args:
        - :param: `model` : a BERT model in `pytorch_pretrained_bert`
        - :param: `refs` (list of str): reference sentences
        - :param: `hyps` (list of str): candidate sentences
        - :param: `tokenizer` : a BERT tokenizer corresponds to `model`
        - :param: `idf_dict` : a dictionary mapping a word piece index to its
                  inverse document frequency
        - :param: `verbose` (bool): turn on intermediate status update
        - :param: `batch_size` (int): bert score processing batch size
        - :param: `device` (str): device to use, e.g. 'cpu' or 'cuda'
    """
    preds = []
    iter_range = range(0, len(refs), batch_size)
    if verbose:
        iter_range = tqdm(iter_range)
    for batch_start in iter_range:
        batch_refs = refs[batch_start:batch_start + batch_size]
        batch_hyps = hyps[batch_start:batch_start + batch_size]
        ref_stats = get_bert_embedding(batch_refs, model, tokenizer, idf_dict,
                                       device=device)
        hyp_stats = get_bert_embedding(batch_hyps, model, tokenizer, idf_dict,
                                       device=device)

        P, R, F1 = greedy_cos_idf(*ref_stats, *hyp_stats)
        preds.append(torch.stack((P, R, F1), dim=1).cpu())
    preds = torch.cat(preds, dim=0)
    return preds
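`get_idf_dict` above weights each word piece by smoothed inverse document frequency over the reference set. A tokenizer-free sketch of the same formula on whitespace tokens (function name and toy inputs are illustrative):

```python
from collections import Counter
from math import log

def idf_weights(docs):
    """IDF with +1 smoothing over token *sets*, as get_idf_dict computes it:
    a token appearing in c of N documents gets weight log((N + 1) / (c + 1)).
    Tokens never seen would default to log(N + 1), the defaultdict fallback."""
    counts = Counter()
    for doc in docs:
        counts.update(set(doc.split()))  # set(): count documents, not occurrences
    n = len(docs)
    return {tok: log((n + 1) / (c + 1)) for tok, c in counts.items()}

w = idf_weights(["the cat", "the dog"])
# "the" appears in every document, so it gets weight log(3/3) = 0,
# while "cat" and "dog" get log(3/2) > 0.
print(w["the"] < w["cat"])  # True
```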
19
KG Reasoning/ATOMIC2020/system_eval/evaluation/bleu/LICENSE
Normal file
@@ -0,0 +1,19 @@
Copyright (c) 2015 Xinlei Chen, Hao Fang, Tsung-Yi Lin, and Ramakrishna Vedantam

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
@@ -0,0 +1 @@
__author__ = 'tylin'
47
KG Reasoning/ATOMIC2020/system_eval/evaluation/bleu/bleu.py
Normal file
@@ -0,0 +1,47 @@
#!/usr/bin/env python
#
# File Name : bleu.py
#
# Description : Wrapper for BLEU scorer.
#
# Creation Date : 06-01-2015
# Last Modified : Thu 19 Mar 2015 09:13:28 PM PDT
# Authors : Hao Fang <hfang@uw.edu> and Tsung-Yi Lin <tl483@cornell.edu>

from evaluation.bleu.bleu_scorer import BleuScorer


class Bleu:
    def __init__(self, n=4):
        # default: compute BLEU score up to 4-grams
        self._n = n
        self._hypo_for_image = {}
        self.ref_for_image = {}

    def compute_score(self, gts, res):
        assert(gts.keys() == res.keys())
        imgIds = gts.keys()

        bleu_scorer = BleuScorer(n=self._n)
        for id in imgIds:
            hypo = res[id]
            ref = gts[id]

            # Sanity check.
            assert(type(hypo) is list)
            assert(len(hypo) == 1)
            assert(type(ref) is list)
            assert(len(ref) >= 1)

            bleu_scorer += (hypo[0], ref)

        #score, scores = bleu_scorer.compute_score(option='shortest')
        score, scores = bleu_scorer.compute_score(option='closest', verbose=0)
        #score, scores = bleu_scorer.compute_score(option='average', verbose=0)

        # return (bleu, bleu_info)
        return score, scores

    def method(self):
        return "Bleu"
@@ -0,0 +1,264 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
# bleu_scorer.py
|
||||
# David Chiang <chiang@isi.edu>
|
||||
|
||||
# Copyright (c) 2004-2006 University of Maryland. All rights
|
||||
# reserved. Do not redistribute without permission from the
|
||||
# author. Not for commercial use.
|
||||
|
||||
# Modified by:
|
||||
# Hao Fang <hfang@uw.edu>
|
||||
# Tsung-Yi Lin <tl483@cornell.edu>
|
||||
|
||||
'''Provides:
|
||||
cook_refs(refs, n=4): Transform a list of reference sentences as strings into a form usable by cook_test().
|
||||
cook_test(test, refs, n=4): Transform a test sentence as a string (together with the cooked reference sentences) into a form usable by score_cooked().
|
||||
'''
|
||||
|
||||
import copy
|
||||
import sys, math, re
|
||||
from collections import defaultdict
|
||||
|
||||
def precook(s, n=4, out=False):
|
||||
"""Takes a string as input and returns an object that can be given to
|
||||
either cook_refs or cook_test. This is optional: cook_refs and cook_test
|
||||
can take string arguments as well."""
|
||||
words = s.split()
|
||||
counts = defaultdict(int)
|
||||
for k in range(1,n+1):
|
||||
for i in range(len(words)-k+1):
|
||||
ngram = tuple(words[i:i+k])
|
||||
counts[ngram] += 1
|
||||
return (len(words), counts)
|
||||
|
||||
def cook_refs(refs, eff=None, n=4): ## lhuang: oracle will call with "average"
|
||||
'''Takes a list of reference sentences for a single segment
|
||||
and returns an object that encapsulates everything that BLEU
|
||||
needs to know about them.'''
|
||||
|
||||
reflen = []
|
||||
maxcounts = {}
|
||||
for ref in refs:
|
||||
rl, counts = precook(ref, n)
|
||||
reflen.append(rl)
|
||||
for (ngram,count) in counts.items():
|
||||
maxcounts[ngram] = max(maxcounts.get(ngram,0), count)
|
||||
|
||||
# Calculate effective reference sentence length.
|
||||
if eff == "shortest":
|
||||
reflen = min(reflen)
|
||||
elif eff == "average":
|
||||
reflen = float(sum(reflen))/len(reflen)
|
||||
|
||||
## lhuang: N.B.: leave reflen computaiton to the very end!!
|
||||
|
||||
## lhuang: N.B.: in case of "closest", keep a list of reflens!! (bad design)
|
||||
|
||||
return (reflen, maxcounts)
|
||||
|
||||
def cook_test(test, tup, eff=None, n=4):
|
||||
'''Takes a test sentence and returns an object that
|
||||
encapsulates everything that BLEU needs to know about it.'''
|
||||
|
||||
(reflen, refmaxcounts) = tup
|
||||
testlen, counts = precook(test, n, True)
|
||||
|
||||
result = {}
|
||||
|
||||
# Calculate effective reference sentence length.
|
||||
|
||||
if eff == "closest":
|
||||
result["reflen"] = min((abs(l-testlen), l) for l in reflen)[1]
|
||||
else: ## i.e., "average" or "shortest" or None
|
||||
result["reflen"] = reflen
|
||||
|
||||
result["testlen"] = testlen
|
||||
|
||||
result["guess"] = [max(0,testlen-k+1) for k in range(1,n+1)]
|
||||
|
||||
result['correct'] = [0]*n
|
||||
for (ngram, count) in counts.items():
|
||||
result["correct"][len(ngram)-1] += min(refmaxcounts.get(ngram,0), count)
|
||||
|
||||
return result
|
||||
|
||||
class BleuScorer(object):
|
||||
"""Bleu scorer.
|
||||
"""
|
||||
|
||||
__slots__ = "n", "crefs", "ctest", "_score", "_ratio", "_testlen", "_reflen", "special_reflen"
|
||||
# special_reflen is used in oracle (proportional effective ref len for a node).
|
||||
|
||||
def copy(self):
|
||||
''' copy the refs.'''
|
||||
new = BleuScorer(n=self.n)
|
||||
new.ctest = copy.copy(self.ctest)
|
||||
new.crefs = copy.copy(self.crefs)
|
||||
new._score = None
|
||||
return new
|
||||
|
||||
def __init__(self, test=None, refs=None, n=4, special_reflen=None):
|
||||
''' singular instance '''
|
||||
|
||||
self.n = n
|
||||
self.crefs = []
|
||||
self.ctest = []
|
||||
self.cook_append(test, refs)
|
||||
self.special_reflen = special_reflen
|
||||
|
||||
def cook_append(self, test, refs):
|
||||
'''called by constructor and __iadd__ to avoid creating new instances.'''
|
||||
|
||||
if refs is not None:
|
||||
self.crefs.append(cook_refs(refs))
|
||||
if test is not None:
|
||||
cooked_test = cook_test(test, self.crefs[-1])
|
||||
self.ctest.append(cooked_test) ## N.B.: -1
|
||||
else:
|
||||
self.ctest.append(None) # lens of crefs and ctest have to match
|
||||
|
||||
self._score = None ## need to recompute
|
||||
|
||||
def ratio(self, option=None):
|
||||
self.compute_score(option=option)
|
||||
return self._ratio
|
||||
|
||||
def score_ratio(self, option=None):
|
||||
'''return (bleu, len_ratio) pair'''
|
||||
return (self.fscore(option=option), self.ratio(option=option))
|
||||
|
||||
def score_ratio_str(self, option=None):
|
||||
return "%.4f (%.2f)" % self.score_ratio(option)
|
||||
|
||||
def reflen(self, option=None):
|
||||
self.compute_score(option=option)
|
||||
return self._reflen
|
||||
|
||||
def testlen(self, option=None):
|
||||
self.compute_score(option=option)
|
||||
return self._testlen
|
||||
|
||||
def retest(self, new_test):
|
||||
if type(new_test) is str:
|
||||
new_test = [new_test]
|
||||
assert len(new_test) == len(self.crefs), new_test
|
||||
self.ctest = []
|
||||
        for t, rs in zip(new_test, self.crefs):
            self.ctest.append(cook_test(t, rs))
        self._score = None

        return self

    def rescore(self, new_test):
        ''' replace test(s) with new test(s), and returns the new score.'''

        return self.retest(new_test).compute_score()

    def size(self):
        assert len(self.crefs) == len(self.ctest), "refs/test mismatch! %d<>%d" % (len(self.crefs), len(self.ctest))
        return len(self.crefs)

    def __iadd__(self, other):
        '''add an instance (e.g., from another sentence).'''

        if type(other) is tuple:
            ## avoid creating new BleuScorer instances
            self.cook_append(other[0], other[1])
        else:
            assert self.compatible(other), "incompatible BLEUs."
            self.ctest.extend(other.ctest)
            self.crefs.extend(other.crefs)
            self._score = None ## need to recompute

        return self

    def compatible(self, other):
        return isinstance(other, BleuScorer) and self.n == other.n

    def single_reflen(self, option="average"):
        return self._single_reflen(self.crefs[0][0], option)

    def _single_reflen(self, reflens, option=None, testlen=None):

        if option == "shortest":
            reflen = min(reflens)
        elif option == "average":
            reflen = float(sum(reflens))/len(reflens)
        elif option == "closest":
            reflen = min((abs(l-testlen), l) for l in reflens)[1]
        else:
            assert False, "unsupported reflen option %s" % option

        return reflen

    def recompute_score(self, option=None, verbose=0):
        self._score = None
        return self.compute_score(option, verbose)

    def compute_score(self, option=None, verbose=0):
        n = self.n
        small = 1e-9
        tiny = 1e-15 ## so that if guess is 0 still return 0
        bleu_list = [[] for _ in range(n)]

        if self._score is not None:
            return self._score

        if option is None:
            option = "average" if len(self.crefs) == 1 else "closest"

        self._testlen = 0
        self._reflen = 0
        totalcomps = {'testlen':0, 'reflen':0, 'guess':[0]*n, 'correct':[0]*n}

        # for each sentence
        for comps in self.ctest:
            testlen = comps['testlen']
            self._testlen += testlen

            if self.special_reflen is None: ## need computation
                reflen = self._single_reflen(comps['reflen'], option, testlen)
            else:
                reflen = self.special_reflen

            self._reflen += reflen

            for key in ['guess','correct']:
                for k in range(n):
                    totalcomps[key][k] += comps[key][k]

            # append per image bleu score
            bleu = 1.
            for k in range(n):
                bleu *= (float(comps['correct'][k]) + tiny) \
                        /(float(comps['guess'][k]) + small)
                bleu_list[k].append(bleu ** (1./(k+1)))
            ratio = (testlen + tiny) / (reflen + small) ## N.B.: avoid zero division
            if ratio < 1:
                for k in range(n):
                    bleu_list[k][-1] *= math.exp(1 - 1/ratio)

            if verbose > 1:
                print(comps, reflen)

        totalcomps['reflen'] = self._reflen
        totalcomps['testlen'] = self._testlen

        bleus = []
        bleu = 1.
        for k in range(n):
            bleu *= float(totalcomps['correct'][k] + tiny) \
                    / (totalcomps['guess'][k] + small)
            bleus.append(bleu ** (1./(k+1)))
        ratio = (self._testlen + tiny) / (self._reflen + small) ## N.B.: avoid zero division
        if ratio < 1:
            for k in range(n):
                bleus[k] *= math.exp(1 - 1/ratio)

        if verbose > 0:
            print(totalcomps)
            print("ratio:", ratio)

        self._score = bleus
        return self._score, bleu_list
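The scoring loop above multiplies smoothed modified n-gram precisions, takes a geometric mean, and applies a brevity penalty when the candidate is shorter than the reference. A minimal single-sentence sketch of that computation (a simplified illustration reusing the `tiny`/`small` smoothing constants from the code above, not the `BleuScorer` API itself):

```python
import math
from collections import Counter

def bleu_n(candidate, references, n=4, small=1e-9, tiny=1e-15):
    """Smoothed sentence-level BLEU-n sketch mirroring compute_score."""
    cand = candidate.split()
    refs = [r.split() for r in references]
    bleu = 1.0
    for k in range(1, n + 1):
        cand_ngrams = Counter(tuple(cand[i:i+k]) for i in range(len(cand)-k+1))
        # clip each candidate n-gram by its max count over all references
        max_ref = Counter()
        for ref in refs:
            ref_ngrams = Counter(tuple(ref[i:i+k]) for i in range(len(ref)-k+1))
            for ng, c in ref_ngrams.items():
                max_ref[ng] = max(max_ref[ng], c)
        correct = sum(min(c, max_ref[ng]) for ng, c in cand_ngrams.items())
        guess = max(len(cand) - k + 1, 0)
        bleu *= (correct + tiny) / (guess + small)
    score = bleu ** (1.0 / n)
    # brevity penalty, as in compute_score ("shortest" reflen option)
    reflen = min(len(r) for r in refs)
    ratio = (len(cand) + tiny) / (reflen + small)
    if ratio < 1:
        score *= math.exp(1 - 1 / ratio)
    return score
```

An exact match scores (essentially) 1.0; a short partial match is pulled down by both the missing n-grams and the brevity penalty.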
@@ -0,0 +1 @@
__author__ = 'tylin'
@@ -0,0 +1,54 @@
# Filename: cider.py
#
# Description: Describes the class to compute the CIDEr (Consensus-Based Image Description Evaluation) Metric
#               by Vedantam, Zitnick, and Parikh (http://arxiv.org/abs/1411.5726)
#
# Creation Date: Sun Feb 8 14:16:54 2015
#
# Authors: Ramakrishna Vedantam <vrama91@vt.edu> and Tsung-Yi Lin <tl483@cornell.edu>

from evaluation.cider.cider_scorer import CiderScorer
import pdb

class Cider:
    """
    Main Class to compute the CIDEr metric

    """
    def __init__(self, test=None, refs=None, n=4, sigma=6.0):
        # set cider to sum over 1 to 4-grams
        self._n = n
        # set the standard deviation parameter for gaussian penalty
        self._sigma = sigma

    def compute_score(self, gts, res):
        """
        Main function to compute CIDEr score
        :param  hypo_for_image (dict) : dictionary with key <image> and value <tokenized hypothesis / candidate sentence>
                ref_for_image (dict)  : dictionary with key <image> and value <tokenized reference sentence>
        :return: cider (float) : computed CIDEr score for the corpus
        """

        assert(gts.keys() == res.keys())
        imgIds = gts.keys()

        cider_scorer = CiderScorer(n=self._n, sigma=self._sigma)

        for id in imgIds:
            hypo = res[id]
            ref = gts[id]

            # Sanity check.
            assert(type(hypo) is list)
            assert(len(hypo) == 1)
            assert(type(ref) is list)
            assert(len(ref) > 0)

            cider_scorer += (hypo[0], ref)

        (score, scores) = cider_scorer.compute_score()

        return score, scores

    def method(self):
        return "CIDEr"
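For orientation, a toy sketch (hypothetical image id and captions) of the `gts`/`res` shape that `Cider.compute_score` asserts: matching keys, exactly one tokenized hypothesis per id, one or more tokenized references per id.

```python
# Hypothetical inputs satisfying the sanity checks in Cider.compute_score.
gts = {"42": ["a cat sits on the mat", "a cat is on a mat"]}  # references
res = {"42": ["a cat on the mat"]}                            # single hypothesis

assert gts.keys() == res.keys()
for i in gts:
    assert isinstance(res[i], list) and len(res[i]) == 1
    assert isinstance(gts[i], list) and len(gts[i]) > 0
```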
@@ -0,0 +1,192 @@
#!/usr/bin/env python
# Tsung-Yi Lin <tl483@cornell.edu>
# Ramakrishna Vedantam <vrama91@vt.edu>

import copy
from collections import defaultdict
import numpy as np
import pdb
import math

def precook(s, n=4, out=False):
    """
    Takes a string as input and returns an object that can be given to
    either cook_refs or cook_test. This is optional: cook_refs and cook_test
    can take string arguments as well.
    :param s: string : sentence to be converted into ngrams
    :param n: int    : number of ngrams for which representation is calculated
    :return: term frequency vector for occurring ngrams
    """
    words = s.split()
    counts = defaultdict(int)
    for k in range(1,n+1):
        for i in range(len(words)-k+1):
            ngram = tuple(words[i:i+k])
            counts[ngram] += 1
    return counts

def cook_refs(refs, n=4): ## lhuang: oracle will call with "average"
    '''Takes a list of reference sentences for a single segment
    and returns an object that encapsulates everything that BLEU
    needs to know about them.
    :param refs: list of string : reference sentences for some image
    :param n: int : number of ngrams for which (ngram) representation is calculated
    :return: result (list of dict)
    '''
    return [precook(ref, n) for ref in refs]

def cook_test(test, n=4):
    '''Takes a test sentence and returns an object that
    encapsulates everything that BLEU needs to know about it.
    :param test: list of string : hypothesis sentence for some image
    :param n: int : number of ngrams for which (ngram) representation is calculated
    :return: result (dict)
    '''
    return precook(test, n, True)

class CiderScorer(object):
    """CIDEr scorer.
    """

    def copy(self):
        ''' copy the refs.'''
        new = CiderScorer(n=self.n)
        new.ctest = copy.copy(self.ctest)
        new.crefs = copy.copy(self.crefs)
        return new

    def __init__(self, test=None, refs=None, n=4, sigma=6.0):
        ''' singular instance '''
        self.n = n
        self.sigma = sigma
        self.crefs = []
        self.ctest = []
        self.document_frequency = defaultdict(float)
        self.cook_append(test, refs)
        self.ref_len = None

    def cook_append(self, test, refs):
        '''called by constructor and __iadd__ to avoid creating new instances.'''

        if refs is not None:
            self.crefs.append(cook_refs(refs))
            if test is not None:
                self.ctest.append(cook_test(test)) ## N.B.: -1
            else:
                self.ctest.append(None) # lens of crefs and ctest have to match

    def size(self):
        assert len(self.crefs) == len(self.ctest), "refs/test mismatch! %d<>%d" % (len(self.crefs), len(self.ctest))
        return len(self.crefs)

    def __iadd__(self, other):
        '''add an instance (e.g., from another sentence).'''

        if type(other) is tuple:
            ## avoid creating new CiderScorer instances
            self.cook_append(other[0], other[1])
        else:
            self.ctest.extend(other.ctest)
            self.crefs.extend(other.crefs)

        return self

    def compute_doc_freq(self):
        '''
        Compute term frequency for reference data.
        This will be used to compute idf (inverse document frequency) later.
        The term frequency is stored in the object
        :return: None
        '''
        for refs in self.crefs:
            # refs, k ref captions of one image
            for ngram in set([ngram for ref in refs for (ngram,count) in ref.items()]):
                self.document_frequency[ngram] += 1
            # maxcounts[ngram] = max(maxcounts.get(ngram,0), count)

    def compute_cider(self):
        def counts2vec(cnts):
            """
            Function maps counts of ngram to vector of tfidf weights.
            The function returns vec, an array of dictionary that store mapping of n-gram and tf-idf weights.
            The n-th entry of array denotes length of n-grams.
            :param cnts:
            :return: vec (array of dict), norm (array of float), length (int)
            """
            vec = [defaultdict(float) for _ in range(self.n)]
            length = 0
            norm = [0.0 for _ in range(self.n)]
            for (ngram,term_freq) in cnts.items():
                # give word count 1 if it doesn't appear in reference corpus
                df = np.log(max(1.0, self.document_frequency[ngram]))
                # ngram index
                n = len(ngram)-1
                # tf (term_freq) * idf (precomputed idf) for n-grams
                vec[n][ngram] = float(term_freq)*(self.ref_len - df)
                # compute norm for the vector. the norm will be used for computing similarity
                norm[n] += pow(vec[n][ngram], 2)

                if n == 1:
                    length += term_freq
            norm = [np.sqrt(n) for n in norm]
            return vec, norm, length

        def sim(vec_hyp, vec_ref, norm_hyp, norm_ref, length_hyp, length_ref):
            '''
            Compute the cosine similarity of two vectors.
            :param vec_hyp: array of dictionary for vector corresponding to hypothesis
            :param vec_ref: array of dictionary for vector corresponding to reference
            :param norm_hyp: array of float for vector corresponding to hypothesis
            :param norm_ref: array of float for vector corresponding to reference
            :param length_hyp: int containing length of hypothesis
            :param length_ref: int containing length of reference
            :return: array of score for each n-grams cosine similarity
            '''
            delta = float(length_hyp - length_ref)
            # measure cosine similarity
            val = np.array([0.0 for _ in range(self.n)])
            for n in range(self.n):
                # ngram
                for (ngram,count) in vec_hyp[n].items():
                    # vrama91 : added clipping
                    val[n] += min(vec_hyp[n][ngram], vec_ref[n][ngram]) * vec_ref[n][ngram]

                if (norm_hyp[n] != 0) and (norm_ref[n] != 0):
                    val[n] /= (norm_hyp[n]*norm_ref[n])

                assert(not math.isnan(val[n]))
                # vrama91: added a length based gaussian penalty
                val[n] *= np.e**(-(delta**2)/(2*self.sigma**2))
            return val

        # compute log reference length
        self.ref_len = np.log(float(len(self.crefs)))

        scores = []
        for test, refs in zip(self.ctest, self.crefs):
            # compute vector for test captions
            vec, norm, length = counts2vec(test)
            # compute vector for ref captions
            score = np.array([0.0 for _ in range(self.n)])
            for ref in refs:
                vec_ref, norm_ref, length_ref = counts2vec(ref)
                score += sim(vec, vec_ref, norm, norm_ref, length, length_ref)
            # change by vrama91 - mean of ngram scores, instead of sum
            score_avg = np.mean(score)
            # divide by number of references
            score_avg /= len(refs)
            # multiply score by 10
            score_avg *= 10.0
            # append score of an image to the score list
            scores.append(score_avg)
        return scores

    def compute_score(self, option=None, verbose=0):
        # compute idf
        self.compute_doc_freq()
        # assert to check document frequency
        assert(len(self.ctest) >= max(self.document_frequency.values()))
        # compute cider score
        score = self.compute_cider()
        # debug
        # print score
        return np.mean(np.array(score)), np.array(score)
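The term-frequency representation built by `precook` is just a dict over all 1..n-grams. A standalone sketch of that counting step (same logic, different name):

```python
from collections import defaultdict

def ngram_counts(sentence, n=4):
    """Mirrors precook: term-frequency dict over all 1..n-grams of a sentence."""
    words = sentence.split()
    counts = defaultdict(int)
    for k in range(1, n + 1):
        for i in range(len(words) - k + 1):
            counts[tuple(words[i:i + k])] += 1
    return counts

c = ngram_counts("the cat the cat", n=2)
```

Here the unigram `("the",)` and the bigram `("the", "cat")` each occur twice, while `("cat", "the")` occurs once.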
139
KG Reasoning/ATOMIC2020/system_eval/evaluation/eval.py
Normal file
@@ -0,0 +1,139 @@
from evaluation.bleu.bleu import Bleu
from evaluation.meteor.meteor_nltk import Meteor
from evaluation.rouge.rouge import Rouge
from evaluation.cider.cider import Cider
# from evaluation.bert_score.bert_score import BertScore
from collections import defaultdict
from argparse import ArgumentParser

import sys
import json
#reload(sys)
#sys.setdefaultencoding('utf-8')

class QGEvalCap:
    def __init__(self, model_key, gts, res, results_file=None):
        self.gts = gts
        self.res = res
        self.results_file = results_file
        self.model_key = model_key

    def evaluate(self):
        output = []
        scorers = [
            (Bleu(1), ["Bleu_1"]),
        ]

        # =================================================
        # Compute scores
        # =================================================
        score_dict = {}
        scores_dict = {}
        #scores_dict["model_key"] = self.model_key
        for scorer, method in scorers:
            # print 'computing %s score...'%(scorer.method())
            score, scores = scorer.compute_score(self.gts, self.res)
            if type(method) == list:
                for sc, scs, m in zip(score, scores, method):
                    #print("%s: %0.5f"%(m, sc))
                    output.append(sc)
                    score_dict[m] = str(sc)
                    scores_dict[m] = list(scs)
            else:
                #print("%s: %0.5f"%(method, score))
                output.append(score)
                score_dict[method] = score
                scores_dict[method] = list(scores)

        if self.results_file != None:
            with open(self.results_file, "a") as f:
                f.write(json.dumps(score_dict)+"\n")

        return score_dict, scores_dict

def eval(model_key, sources, references, predictions, results_file=None):
    """
    Given aligned sources, references, and predictions, calculate the metric scores.
    """

    pairs = []

    for tup in sources:
        pair = {}
        pair['tokenized_sentence'] = tup
        pairs.append(pair)

    cnt = 0
    for line in references:
        pairs[cnt]['tokenized_question'] = line
        cnt += 1

    output = predictions

    for idx, pair in enumerate(pairs):
        pair['prediction'] = output[idx]

    ## eval
    from evaluation.eval import QGEvalCap
    import json
    from json import encoder
    encoder.FLOAT_REPR = lambda o: format(o, '.4f')

    res = defaultdict(lambda: [])
    gts = defaultdict(lambda: [])
    for pair in pairs[:]:
        key = pair['tokenized_sentence']
        #res[key] = [pair['prediction']]
        res[key] = pair['prediction']

        ## gts
        gts[key].append(pair['tokenized_question'])

    QGEval = QGEvalCap(model_key, gts, res, results_file)
    return QGEval.evaluate()


def preprocess(file_name, keys):
    with open(file_name) as f:
        data = f.readlines()
    generations = [json.loads(elem) for elem in data]

    predictions = {}
    references = {}
    sources = {}
    # generations is a list of records; take the model keys from the first one
    keys_list = keys if keys != None else list(generations[0]["generations"].keys())
    for key in keys_list:
        references[key] = []
        predictions[key] = []
        sources[key] = []

    for elem in generations:
        label = elem["label"]
        hyp = elem["hyp"+label]
        for key in keys_list:
            if key in elem["generations"]:
                references[key].append(hyp)
                predictions[key].append(elem["generations"][key])
                sources[key].append((elem["obs1"], elem["obs2"]))

    return sources, references, predictions


if __name__ == "__main__":
    parser = ArgumentParser()
    parser.add_argument("-gen_file", "--gen_file", dest="gen_file", help="generations file with gold/references")
    parser.add_argument("--keys", type=str, default=None, help="comma-separated list of model keys")
    parser.add_argument("--results_file", default="eval_results.jsonl")
    args = parser.parse_args()

    print("scores: \n")
    keys = None
    if args.keys:
        keys = args.keys.split(",")

    sources, references, predictions = preprocess(args.gen_file, keys)
    for key in references.keys():
        print("\nEvaluating %s" % key)
        eval(key, sources[key], references[key], predictions[key], args.results_file)
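Inside `eval()`, predictions and references are regrouped by source key with `defaultdict`: the hypothesis for a key is set directly, while references accumulate. A toy sketch of that grouping (hypothetical keys and strings):

```python
from collections import defaultdict

# Hypothetical pairs in the shape eval() builds before scoring.
pairs = [
    {"tokenized_sentence": "s1", "tokenized_question": "q1", "prediction": ["p1"]},
    {"tokenized_sentence": "s1", "tokenized_question": "q2", "prediction": ["p1"]},
]

res = defaultdict(list)  # one hypothesis list per source key
gts = defaultdict(list)  # all references for that key
for pair in pairs:
    key = pair["tokenized_sentence"]
    res[key] = pair["prediction"]
    gts[key].append(pair["tokenized_question"])
```

Duplicate sources therefore share one hypothesis but contribute multiple references, which is the multi-reference format the scorers expect.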
@@ -0,0 +1 @@
__author__ = 'tylin'
Binary file not shown.
@@ -0,0 +1,91 @@
#!/usr/bin/env python

# Python wrapper for METEOR implementation, by Xinlei Chen
# Acknowledge Michael Denkowski for the generous discussion and help

import os
import sys
import subprocess
import threading

# Assumes meteor-1.5.jar is in the same directory as meteor.py. Change as needed.
METEOR_JAR = 'meteor-1.5.jar'
# print METEOR_JAR

class Meteor:

    def __init__(self):
        self.meteor_cmd = ['java', '-jar', '-Xmx2G', METEOR_JAR,
                '-', '-', '-stdio', '-l', 'en',
                '-norm',
                # '-t', 'adq'
                # '-p', '0.85 0.2 0.6 0.75' # alpha beta gamma delta'',
                # '-a', 'data/paraphrase-en.gz', '-m', 'exact stem paraphrase']
                ]
        self.meteor_p = subprocess.Popen(self.meteor_cmd,
                cwd=os.path.dirname(os.path.abspath(__file__)),
                stdin=subprocess.PIPE,
                stdout=subprocess.PIPE,
                stderr=subprocess.PIPE)
        # Used to guarantee thread safety
        self.lock = threading.Lock()

    def compute_score(self, gts, res):
        assert(gts.keys() == res.keys())
        imgIds = gts.keys()
        scores = []

        eval_line = 'EVAL'
        self.lock.acquire()
        for i in imgIds:
            assert(len(res[i]) == 1)
            stat = self._stat(res[i][0], gts[i])
            eval_line += ' ||| {}'.format(stat)

        print('{}\n'.format(eval_line))
        # the jar communicates over byte pipes, so encode on write and decode on read
        self.meteor_p.stdin.write('{}\n'.format(eval_line).encode('utf-8'))
        self.meteor_p.stdin.flush()
        print(self.meteor_p.stdout.readline().decode('utf-8').strip())

        for i in range(0, len(imgIds)):
            scores.append(float(self.meteor_p.stdout.readline().decode('utf-8').strip()))
        score = float(self.meteor_p.stdout.readline().decode('utf-8').strip())
        self.lock.release()

        return score, scores

    def method(self):
        return "METEOR"

    def _stat(self, hypothesis_str, reference_list):
        # SCORE ||| reference 1 words ||| reference n words ||| hypothesis words
        hypothesis_str = hypothesis_str.replace('|||', '').replace('  ', ' ')
        score_line = ' ||| '.join(('SCORE', ' ||| '.join(reference_list), hypothesis_str))
        # print score_line
        str_in = '{}\n'.format(score_line)
        #self.meteor_p.communicate(str_in.encode('utf-8'))
        self.meteor_p.stdin.write(str_in.encode('utf-8'))
        self.meteor_p.stdin.flush()
        return self.meteor_p.stdout.readline().decode('utf-8').strip()

    def _score(self, hypothesis_str, reference_list):
        self.lock.acquire()
        # SCORE ||| reference 1 words ||| reference n words ||| hypothesis words
        hypothesis_str = hypothesis_str.replace('|||', '').replace('  ', ' ')
        score_line = ' ||| '.join(('SCORE', ' ||| '.join(reference_list), hypothesis_str))
        self.meteor_p.stdin.write('{}\n'.format(score_line).encode('utf-8'))
        self.meteor_p.stdin.flush()
        stats = self.meteor_p.stdout.readline().decode('utf-8').strip()
        eval_line = 'EVAL ||| {}'.format(stats)
        # EVAL ||| stats
        self.meteor_p.stdin.write('{}\n'.format(eval_line).encode('utf-8'))
        self.meteor_p.stdin.flush()
        score = float(self.meteor_p.stdout.readline().decode('utf-8').strip())
        # bug fix: there are two values returned by the jar file, one average, and one all, so do it twice
        # thanks to Andrej for pointing this out
        score = float(self.meteor_p.stdout.readline().decode('utf-8').strip())
        self.lock.release()
        return score

    def __del__(self):
        self.lock.acquire()
        self.meteor_p.stdin.close()
        self.meteor_p.kill()
        self.meteor_p.wait()
        self.lock.release()
@@ -0,0 +1,42 @@
#!/usr/bin/env python

# Python wrapper for METEOR implementation, by Xinlei Chen
# Acknowledge Michael Denkowski for the generous discussion and help

import os
import sys
import nltk
from nltk.translate.meteor_score import meteor_score

# Assumes meteor-1.5.jar is in the same directory as meteor.py. Change as needed.
#METEOR_JAR = 'meteor-1.5.jar'
# print METEOR_JAR

class Meteor:

    def __init__(self):
        pass

    def compute_score(self, gts, res):
        assert(gts.keys() == res.keys())
        imgIds = gts.keys()
        scores = []

        for i in imgIds:
            assert(len(res[i]) == 1)
            score = round(meteor_score(gts[i], res[i][0]), 4)
            scores.append(score)
        #print('{}\n'.format(eval_line))
        #self.meteor_p.stdin.write('{}\n'.format(eval_line))
        #print(self.meteor_p.stdout.readline().strip())

        #for i in range(0,len(imgIds)):
        #    scores.append(float(self.meteor_p.stdout.readline().strip()))
        #score = float(self.meteor_p.stdout.readline().strip())
        #self.lock.release()

        return sum(scores)/len(scores), scores

    def method(self):
        return "METEOR"
@@ -0,0 +1 @@
__author__ = 'vrama91'
106
KG Reasoning/ATOMIC2020/system_eval/evaluation/rouge/rouge.py
Normal file
@@ -0,0 +1,106 @@
#!/usr/bin/env python
#
# File Name : rouge.py
#
# Description : Computes ROUGE-L metric as described by Lin and Hovey (2004)
#
# Creation Date : 2015-01-07 06:03
# Author : Ramakrishna Vedantam <vrama91@vt.edu>

import numpy as np
import pdb

def my_lcs(string, sub):
    """
    Calculates longest common subsequence for a pair of tokenized strings
    :param string : list of str : tokens from a string split using whitespace
    :param sub : list of str : shorter string, also split using whitespace
    :returns: length (int): length of the longest common subsequence between the two strings

    Note: my_lcs only gives length of the longest common subsequence, not the actual LCS
    """
    if(len(string)< len(sub)):
        sub, string = string, sub

    lengths = [[0 for i in range(0,len(sub)+1)] for j in range(0,len(string)+1)]

    for j in range(1,len(sub)+1):
        for i in range(1,len(string)+1):
            if(string[i-1] == sub[j-1]):
                lengths[i][j] = lengths[i-1][j-1] + 1
            else:
                lengths[i][j] = max(lengths[i-1][j] , lengths[i][j-1])

    return lengths[len(string)][len(sub)]

class Rouge():
    '''
    Class for computing ROUGE-L score for a set of candidate sentences for the MS COCO test set

    '''
    def __init__(self):
        # vrama91: updated the value below based on discussion with Hovey
        self.beta = 1.2

    def calc_score(self, candidate, refs):
        """
        Compute ROUGE-L score given one candidate and references for an image
        :param candidate: list of str : candidate sentence to be evaluated (a single-element list)
        :param refs: list of str : COCO reference sentences for the particular image to be evaluated
        :returns score: float (ROUGE-L score for the candidate evaluated against references)
        """
        assert(len(candidate)==1)
        assert(len(refs)>0)
        prec = []
        rec = []

        # split into tokens
        token_c = candidate[0].split(" ")

        for reference in refs:
            # split into tokens
            token_r = reference.split(" ")
            # compute the longest common subsequence
            lcs = my_lcs(token_r, token_c)
            prec.append(lcs/float(len(token_c)))
            rec.append(lcs/float(len(token_r)))

        prec_max = max(prec)
        rec_max = max(rec)

        if(prec_max!=0 and rec_max !=0):
            score = ((1 + self.beta**2)*prec_max*rec_max)/float(rec_max + self.beta**2*prec_max)
        else:
            score = 0.0
        return score

    def compute_score(self, gts, res):
        """
        Computes Rouge-L score given a set of reference and candidate sentences for the dataset
        Invoked by evaluate_captions.py
        :param hypo_for_image: dict : candidate / test sentences with "image name" key and "tokenized sentences" as values
        :param ref_for_image: dict : reference MS-COCO sentences with "image name" key and "tokenized sentences" as values
        :returns: average_score: float (mean ROUGE-L score computed by averaging scores for all the images)
        """
        assert(gts.keys() == res.keys())
        imgIds = gts.keys()

        score = []
        for id in imgIds:
            hypo = res[id]
            ref = gts[id]

            score.append(self.calc_score(hypo, ref))

            # Sanity check.
            assert(type(hypo) is list)
            assert(len(hypo) == 1)
            assert(type(ref) is list)
            assert(len(ref) > 0)

        average_score = np.mean(np.array(score))
        print("len score:", len(score))
        return average_score, np.array(score)

    def method(self):
        return "Rouge"
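`calc_score` combines LCS-based precision and recall into an F-measure weighted by `beta`. A compact single-reference sketch of the same computation (simplified names, same DP recurrence and F-score formula as above):

```python
def lcs_len(a, b):
    """Dynamic-programming LCS length, as in my_lcs above."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

def rouge_l(candidate, reference, beta=1.2):
    """ROUGE-L F-score for one candidate against one reference string."""
    c, r = candidate.split(), reference.split()
    lcs = lcs_len(c, r)
    if lcs == 0:
        return 0.0
    prec, rec = lcs / len(c), lcs / len(r)
    return ((1 + beta**2) * prec * rec) / (rec + beta**2 * prec)
```

With identical strings precision and recall are both 1, so the F-score is exactly 1 regardless of `beta`; disjoint strings score 0.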
4
KG Reasoning/ATOMIC2020/system_eval/requirements.txt
Normal file
@@ -0,0 +1,4 @@
bert-score
tabulate
nltk==3.5
numpy
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
23
KG Reasoning/ATOMIC2020/system_eval/t.tsv
Normal file
@@ -0,0 +1,23 @@
PersonX also saw ___ @@ isFilledBy the dolphin| the whale| monuments| boss| trees| coworker| the car| flowers| birds
wage war @@ HasSubEvent attack enemy| people get killed| reconsider all options| cities destroyed| fight| people become underfed| people protest| write proclamation declaring war| assination| people become injured or killed| become angry| human lives lost| victory| prepare for battle| people in armies would fight| stop fighting| bombs droped| arms makers grow more wealthy| killing enemy soldiers| declare war| win or lose| attrition| get bad| defeat| engage deception| attack opposing armies| innocent people get killed| national borders change| death| people become refugees| people become homeless| negotiate peace| may get into struggle
PersonX goes from zero to hero @@ HinderedBy PersonX is a wimp| PersonX got sick from Scott| PersonX has been knocked out by his girlfriend| PersonX does not have the courage to confront the situation.| PersonX does not have the right equipment to defeat the enemy.| PersonX does not have a gym membership| PersonX has no skills| PersonX is afraid| PersonX was made too weak by his mom's diet
PersonX is walking on the beach @@ xAttr content| healthy| relaxed| alone| restive| blissful| care-free
PersonX cuts open ___ @@ oEffect rushed to hospital| none| operated on
PersonX becomes PersonY member @@ xReact included
kerosene @@ ObjectUse create heat| kill head lice| soak a rag| clean paint off a wall| clean off grease| burn someone's house who cheated on you| remove a messbecom| disinfect a wound| become arsonist
PersonX loves PersonX's dog @@ xEffect want to feed a dog| none| he want get a dog| he interested a pets| he jalouse for another
PersonX stops kissing PersonY @@ HinderedBy PersonX is out of control and has become overcome by his sexual desires.| PersonY is holding PersonX down.| They are in love with Person Y| PersonY will leave PersonX.| They are feeling the passion of the moment.| PersonY is crying out of control and PersonX is full of compassion.
PersonX looks at PersonY like that @@ isAfter PersonX is on a date with PersonY
PersonX quickly fell in love @@ oReact glad| overjoyed| happy.
PersonX loves PersonX's work @@ xIntent to be productive.| to be accomplished| have a job they love| none| to succeed.
PersonX takes PersonY everywhere @@ xWant take person Y home| to eat dinner| to get PersonY familiar with the area| to rest| refill up with gas| to answer PersonY's questions
PersonX gets close to PersonY @@ oWant to spend time with X| to marry X| to bond with x| to reciprocate the feelings| to avoid x| to spend as much time with them as they can
jellyfish @@ AtLocation ocean| potato sout| all oceans of world| most oceans| surf| sea at coral reefs| salt water| tidal waters| shores washed up and dead| thai restaurant| monterey bay aquarium| north sea| warm ocean| peanut butter pool| coral reef| marine aquarium| thesand| public aquarium| chesapeake bay| restaurant with strange menu| mediterranean sea| open ocean| zoo| pervet's bedroom| tropical waters| aqurium| see| jelly bean| warm ocean water| detroit zoo| book| ocea| sandwith| gulf| sushi restaurant| encyclopedia| tidal pools| current| pond| weirdest places| cartoon| penny candy aisle| tropical body of water| international waters| lake| atlantic ocean| ocean or aquarium| japanese restaurant| chinese entree| warm sea| art| baltimore aquarium| maui| jamaca| movie| sea water| oriental restaurant| bay| smack| oceans and seas| deep ocean| texas| saltwater| hand| saardina| underwater| hawaii| monterey bay| florida| tank| oceanic trench| chinese restaurant| cuba| jungle| photographs| warm ocean waters| osean| red sea| store| aquarium| calm waters| pacific ocean| sea world| bathing suit| shore| ocean water
PersonX makes PersonY's laugh @@ isBefore PersonX gets a drink for PersonY
mouse @@ MadeUpOf mouse button| mouse wheel
stallion @@ CapableOf service mare| cover mare
sand @@ HasProperty found at beach| found in desert| dry and small| found on beach| played in| black| made into glass| found on beaches| white| made up of tiny ground rocks| gritty
gardener @@ NotDesires plants to die
hot weather @@ Causes fainting
stay in bed @@ xReason of cold| were sick| feel ill| you're sick| you're still tired| don't feel well| back hurts| have sore throat
hen @@ Desires bread crumbs
192
KG Reasoning/ATOMIC2020/system_eval/utils.py
Normal file
@@ -0,0 +1,192 @@
import json
import sys
import csv
import operator
import random


def read_csv(input_file, quotechar='"', delimiter=",", skip_header=False):
    """Reads a delimited text file (comma separated by default)."""
    with open(input_file, "r") as f:
        reader = csv.reader(f, delimiter=delimiter, quotechar=quotechar, quoting=csv.QUOTE_ALL, skipinitialspace=True)
        lines = []
        for line in reader:
            if sys.version_info[0] == 2:
                line = list(unicode(cell, 'utf-8') for cell in line)
            lines.append(line)
        if skip_header:
            lines = lines[1:]
        return lines
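As a quick illustration of the reader settings `read_csv` uses, here is an in-memory round-trip with the same `csv` options (hypothetical toy rows, no file on disk):

```python
import csv
import io

# Write a small table with QUOTE_ALL, then read it back with the same
# settings read_csv passes to csv.reader.
buf = io.StringIO()
writer = csv.writer(buf, quotechar='"', delimiter=",", quoting=csv.QUOTE_ALL)
writer.writerow(["head", "relation", "tail"])
writer.writerow(["PersonX goes from zero to hero", "xNeed", "assess a situation"])

buf.seek(0)
reader = csv.reader(buf, delimiter=",", quotechar='"', quoting=csv.QUOTE_ALL, skipinitialspace=True)
rows = list(reader)
```

`QUOTE_ALL` quotes every field on write, and `skipinitialspace=True` tolerates a space after each delimiter on read.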


def write_tsv(output_file, data, header=False):
    keys = list(data[0].keys())
    with open(output_file, 'w') as f:
        w = csv.DictWriter(f, keys, delimiter='\t', lineterminator='\n')
        if header:
            w.writeheader()
        for r in data:
            entry = {k: r[k] for k in keys}
            w.writerow(entry)


def write_array2tsv(output_file, data, header=False):
    keys = range(len(data[0]))
    with open(output_file, 'w') as f:
        w = csv.DictWriter(f, keys, delimiter='\t', lineterminator='\n')
        if header:
            w.writeheader()
        for r in data:
            entry = {k: r[k] for k in keys}
            w.writerow(entry)


def write_csv(filename, data, fieldnames):
    with open(filename, 'w', newline='') as csvfile:
        writer = csv.DictWriter(csvfile, fieldnames=fieldnames)

        writer.writeheader()
        for d in data:
            formatted_d = {}
            for key, val in d.items():
                formatted_d[key] = json.dumps(val)
            writer.writerow(formatted_d)


def read_jsonl(filename):
    data = []
    with open(filename, "r") as f:
        for line in f:
            data.append(json.loads(line))
    return data


def write_items(output_file, items):
    with open(output_file, 'w') as f:
        for concept in items:
            f.write(concept + "\n")
    f.close()


def write_jsonl(f, d):
    write_items(f, [json.dumps(r) for r in d])


def count_relation(d):
    relation_count = {}
    prefix_count = {}
    head_count = {}
    for l in d:
        r = l[1]
        if r not in relation_count.keys():
            relation_count[r] = 0
        relation_count[r] += 1

        prefix = l[0]+l[1]
        if prefix not in prefix_count.keys():
            prefix_count[prefix] = 0
        prefix_count[prefix] += 1

        head = l[0]
        if head not in head_count.keys():
            head_count[head] = 0
        head_count[head] += 1

    sorted_relation_count = dict(sorted(relation_count.items(), key=operator.itemgetter(1), reverse=True))
    sorted_prefix_count = dict(sorted(prefix_count.items(), key=operator.itemgetter(1), reverse=True))
    sorted_head_count = dict(sorted(head_count.items(), key=operator.itemgetter(1), reverse=True))

    print("Relations:")
    for r in sorted_relation_count.keys():
        print(r, sorted_relation_count[r])

    print("\nPrefixes:")
    print("uniq prefixes: ", len(sorted_prefix_count.keys()))
    i = 0
    for r in sorted_prefix_count.keys():
        print(r, sorted_prefix_count[r])
        i += 1
        if i > 20:
            break

    print("\nHeads:")
    i = 0
    for r in sorted_head_count.keys():
        print(r, sorted_head_count[r])
        i += 1
        if i > 20:
            break


def get_head_set(d):
    return set([l[0] for l in d])


def head_based_split(data, dev_size, test_size, head_size_threshold=500, dev_heads=[], test_heads=[]):
    """
    :param data: the tuples to split according to the heads, where the head is the first element of each tuple
    :param dev_size: target size of the dev set
    :param test_size: target size of the test set
|
||||
:param head_size_threshold: Maximum number of tuples a head can be involved in,
|
||||
in order to be considered for the dev/test set'
|
||||
:param dev_heads: heads that are forced to belong to the dev set
|
||||
:param test_heads: heads that are forced to belong to the test set
|
||||
:return:
|
||||
"""
|
||||
head_count = {}
|
||||
for l in data:
|
||||
head = l[0]
|
||||
if head not in head_count.keys():
|
||||
head_count[head] = 0
|
||||
head_count[head] += 1
|
||||
|
||||
remaining_heads = dict(head_count)
|
||||
|
||||
test_selected_heads = {}
|
||||
test_head_total_count = 0
|
||||
|
||||
for h in test_heads:
|
||||
if h in remaining_heads:
|
||||
c = remaining_heads[h]
|
||||
test_selected_heads[h] = c
|
||||
test_head_total_count += c
|
||||
remaining_heads.pop(h)
|
||||
|
||||
while test_head_total_count < test_size:
|
||||
h = random.sample(remaining_heads.keys(), 1)[0]
|
||||
c = remaining_heads[h]
|
||||
if c < head_size_threshold:
|
||||
test_selected_heads[h] = c
|
||||
test_head_total_count += c
|
||||
remaining_heads.pop(h)
|
||||
|
||||
test = [l for l in data if l[0] in test_selected_heads.keys()]
|
||||
|
||||
dev_selected_heads = {}
|
||||
dev_head_total_count = 0
|
||||
|
||||
for h in dev_heads:
|
||||
if h in remaining_heads:
|
||||
c = remaining_heads[h]
|
||||
dev_selected_heads[h] = c
|
||||
dev_head_total_count += c
|
||||
remaining_heads.pop(h)
|
||||
|
||||
while dev_head_total_count < dev_size:
|
||||
h = random.sample(remaining_heads.keys(), 1)[0]
|
||||
c = remaining_heads[h]
|
||||
if c < head_size_threshold:
|
||||
dev_selected_heads[h] = c
|
||||
dev_head_total_count += c
|
||||
remaining_heads.pop(h)
|
||||
|
||||
dev = [l for l in data if l[0] in dev_selected_heads.keys()]
|
||||
|
||||
dev_test_heads = set(list(dev_selected_heads.keys()) + list(test_selected_heads.keys()))
|
||||
train = [l for l in data if l[0] not in dev_test_heads]
|
||||
|
||||
return train, dev, test
|
||||
|
||||
|
||||
def remove_prefix(text, prefix):
|
||||
return text[text.startswith(prefix) and len(prefix):]
|
||||
25
KG Reasoning/FB15k-237/data/sample.txt
Normal file
@@ -0,0 +1,25 @@
China|military military combatant military conflicts. military military combatant group combatants|North Vietnam
Babylon 5: The Gathering|film film distributors. film film film distributor relationship film distribution medium|VHS
Consultant|people profession specialization of|Businessperson-GB
Berkshire Hathaway|award ranked item appears in ranked lists. award ranking list|Fortune 1000
Jackass: The Movie|film film release date s. film film regional release date film release distribution medium|DVD
Vice President-GB|business job title people with this title. business employment tenure company|American International Group
29th Academy Awards|time event instance of recurring event|Academy Awards
Angel|tv tv program languages|English Language
Stevie Wonder|music group member membership. music group membership group|U.S.A. for Africa
Primetime Emmy Award for Outstanding Guest Actress - Comedy Series|award award category category of|Primetime Emmy Award
Brian Drummond|film actor dubbing performances. film dubbing performance language|English Language
Musician-GB|tv non character role tv regular personal appearances. tv tv regular personal appearance person|Itzhak Perlman
House|tv tv program genre|Drama
40th Academy Awards|time event locations|Los Angeles
The Tourist|film film produced by|Gary Barber
South Sudan|base aareas schema administrative area administrative area type|Sovereign state
Outfielder|sports sports position players. sports sports team roster team|Boston Red Sox
Charles Cornwallis, 1st Marquess Cornwallis|government politician government positions held. government government position held jurisdiction of office|British Raj
Switzerland|olympics olympic participating country medals won. olympics olympic medal honor olympics|1984 Winter Olympics
Andrew Jackson|government politician government positions held. government government position held basic title|President
Photography|education field of study students majoring. education education major field of study|Psychology
The Dark Knight Rises|film film costume design by|Lindy Hemming
Vibraphone|music performance role regular performances. music group membership role|Baritone
Four Weddings and a Funeral|award award winning work awards won. award award honor award|BAFTA Award for Best Direction
Saturday Night's Main Event|tv tv program country of origin|United States of America
25
KG Reasoning/FB15k-237/prompts/prompt_0.txt
Normal file
@@ -0,0 +1,25 @@
predict the tail entity [MASK] from the given (China, military military combatant military conflicts. military military combatant group combatants, [MASK]) by completing the sentence "what is the combatants of China? The answer is".
predict the tail entity [MASK] from the given (Babylon 5: The Gathering, film film distributors. film film film distributor relationship film distribution medium, [MASK]) by completing the sentence "what is the film distribution medium of Babylon 5: The Gathering? The answer is".
predict the tail entity [MASK] from the given (Consultant, people profession specialization of, [MASK]) by completing the sentence "what is the specialization of of Consultant? The answer is".
predict the tail entity [MASK] from the given (Berkshire Hathaway, award ranked item appears in ranked lists. award ranking list, [MASK]) by completing the sentence "what is the list of Berkshire Hathaway? The answer is".
predict the tail entity [MASK] from the given (Jackass: The Movie, film film release date s. film film regional release date film release distribution medium, [MASK]) by completing the sentence "what is the film release distribution medium of Jackass: The Movie? The answer is".
predict the tail entity [MASK] from the given (Vice President-GB, business job title people with this title. business employment tenure company, [MASK]) by completing the sentence "what is the company of Vice President-GB? The answer is".
predict the tail entity [MASK] from the given (29th Academy Awards, time event instance of recurring event, [MASK]) by completing the sentence "what is the instance of recurring event of 29th Academy Awards? The answer is".
predict the tail entity [MASK] from the given (Angel, tv tv program languages, [MASK]) by completing the sentence "what is the languages of Angel? The answer is".
predict the tail entity [MASK] from the given (Stevie Wonder, music group member membership. music group membership group, [MASK]) by completing the sentence "what is the group of Stevie Wonder? The answer is".
predict the tail entity [MASK] from the given (Primetime Emmy Award for Outstanding Guest Actress - Comedy Series, award award category category of, [MASK]) by completing the sentence "what is the category of of Primetime Emmy Award for Outstanding Guest Actress - Comedy Series? The answer is".
predict the tail entity [MASK] from the given (Brian Drummond, film actor dubbing performances. film dubbing performance language, [MASK]) by completing the sentence "what is the language of Brian Drummond? The answer is".
predict the tail entity [MASK] from the given (Musician-GB, tv non character role tv regular personal appearances. tv tv regular personal appearance person, [MASK]) by completing the sentence "what is the person of Musician-GB? The answer is".
predict the tail entity [MASK] from the given (House, tv tv program genre, [MASK]) by completing the sentence "what is the genre of House? The answer is".
predict the tail entity [MASK] from the given (40th Academy Awards, time event locations, [MASK]) by completing the sentence "what is the locations of 40th Academy Awards? The answer is".
predict the tail entity [MASK] from the given (The Tourist, film film produced by, [MASK]) by completing the sentence "what is the produced by of The Tourist? The answer is".
predict the tail entity [MASK] from the given (South Sudan, base aareas schema administrative area administrative area type, [MASK]) by completing the sentence "what is the administrative area type of South Sudan? The answer is".
predict the tail entity [MASK] from the given (Outfielder, sports sports position players. sports sports team roster team, [MASK]) by completing the sentence "what is the team of Outfielder? The answer is".
predict the tail entity [MASK] from the given (Charles Cornwallis, 1st Marquess Cornwallis, government politician government positions held. government government position held jurisdiction of office, [MASK]) by completing the sentence "what is the jurisdiction of office of Charles Cornwallis, 1st Marquess Cornwallis? The answer is".
predict the tail entity [MASK] from the given (Switzerland, olympics olympic participating country medals won. olympics olympic medal honor olympics, [MASK]) by completing the sentence "what is the olympics of Switzerland? The answer is".
predict the tail entity [MASK] from the given (Andrew Jackson, government politician government positions held. government government position held basic title, [MASK]) by completing the sentence "what is the basic title of Andrew Jackson? The answer is".
predict the tail entity [MASK] from the given (Photography, education field of study students majoring. education education major field of study, [MASK]) by completing the sentence "what is the major field of study of Photography? The answer is".
predict the tail entity [MASK] from the given (The Dark Knight Rises, film film costume design by, [MASK]) by completing the sentence "what is the costume design by of The Dark Knight Rises? The answer is".
predict the tail entity [MASK] from the given (Vibraphone, music performance role regular performances. music group membership role, [MASK]) by completing the sentence "what is the role of Vibraphone? The answer is".
predict the tail entity [MASK] from the given (Four Weddings and a Funeral, award award winning work awards won. award award honor award, [MASK]) by completing the sentence "what is the award of Four Weddings and a Funeral? The answer is".
predict the tail entity [MASK] from the given (Saturday Night's Main Event, tv tv program country of origin, [MASK]) by completing the sentence "what is the country of origin of Saturday Night's Main Event? The answer is".
68
KG Reasoning/FB15k-237/prompts/prompt_1.txt
Normal file
@@ -0,0 +1,68 @@
predict the tail entity [MASK] from the given (Yemen-US, military military combatant military conflicts. military military combatant group combatants, [MASK]) by completing the sentence "what is the combatants of Yemen-US? The answer is ".The answer is Saudi Arabia, so the [MASK] is Saudi Arabia.
predict the tail entity [MASK] from the given (China, military military combatant military conflicts. military military combatant group combatants, [MASK]) by completing the sentence "what is the combatants of China? The answer is".

predict the tail entity [MASK] from the given (Harry Potter and the Chamber of Secrets, film film distributors. film film film distributor relationship film distribution medium, [MASK]) by completing the sentence "what is the film distribution medium of Harry Potter and the Chamber of Secrets? The answer is ".The answer is VHS, so the [MASK] is VHS.
predict the tail entity [MASK] from the given (Babylon 5: The Gathering, film film distributors. film film film distributor relationship film distribution medium, [MASK]) by completing the sentence "what is the film distribution medium of Babylon 5: The Gathering? The answer is".

predict the tail entity [MASK] from the given (Research Associate, people profession specialization of, [MASK]) by completing the sentence "what is the specialization of of Research Associate? The answer is ".The answer is Architecture, so the [MASK] is Architecture.
predict the tail entity [MASK] from the given (Consultant, people profession specialization of, [MASK]) by completing the sentence "what is the specialization of of Consultant? The answer is".

predict the tail entity [MASK] from the given (University of Southern California, award ranked item appears in ranked lists. award ranking list, [MASK]) by completing the sentence "what is the list of University of Southern California? The answer is ".The answer is Times Higher Education World University Rankings, so the [MASK] is Times Higher Education World University Rankings.
predict the tail entity [MASK] from the given (Berkshire Hathaway, award ranked item appears in ranked lists. award ranking list, [MASK]) by completing the sentence "what is the list of Berkshire Hathaway? The answer is".

predict the tail entity [MASK] from the given (Vanilla Sky, film film release date s. film film regional release date film release distribution medium, [MASK]) by completing the sentence "what is the film release distribution medium of Vanilla Sky? The answer is ".The answer is DVD, so the [MASK] is DVD.
predict the tail entity [MASK] from the given (Jackass: The Movie, film film release date s. film film regional release date film release distribution medium, [MASK]) by completing the sentence "what is the film release distribution medium of Jackass: The Movie? The answer is".

predict the tail entity [MASK] from the given (Managing Director-GB, business job title people with this title. business employment tenure company, [MASK]) by completing the sentence "what is the company of Managing Director-GB? The answer is ".The answer is Marriott International, so the [MASK] is Marriott International.
predict the tail entity [MASK] from the given (Vice President-GB, business job title people with this title. business employment tenure company, [MASK]) by completing the sentence "what is the company of Vice President-GB? The answer is".

predict the tail entity [MASK] from the given (John Adams, tv tv program languages, [MASK]) by completing the sentence "what is the languages of John Adams? The answer is ".The answer is English Language, so the [MASK] is English Language.
predict the tail entity [MASK] from the given (Angel, tv tv program languages, [MASK]) by completing the sentence "what is the languages of Angel? The answer is".

predict the tail entity [MASK] from the given (Dave Grohl, music group member membership. music group membership group, [MASK]) by completing the sentence "what is the group of Dave Grohl? The answer is ".The answer is Queens of the Stone Age, so the [MASK] is Queens of the Stone Age.
predict the tail entity [MASK] from the given (Stevie Wonder, music group member membership. music group membership group, [MASK]) by completing the sentence "what is the group of Stevie Wonder? The answer is".

predict the tail entity [MASK] from the given (Academy Award for Best Film Editing, award award category category of, [MASK]) by completing the sentence "what is the category of of Academy Award for Best Film Editing? The answer is ".The answer is Academy Awards, so the [MASK] is Academy Awards.
predict the tail entity [MASK] from the given (Primetime Emmy Award for Outstanding Guest Actress - Comedy Series, award award category category of, [MASK]) by completing the sentence "what is the category of of Primetime Emmy Award for Outstanding Guest Actress - Comedy Series? The answer is".

predict the tail entity [MASK] from the given (Billy Crystal, film actor dubbing performances. film dubbing performance language, [MASK]) by completing the sentence "what is the language of Billy Crystal? The answer is ".The answer is English Language, so the [MASK] is English Language.
predict the tail entity [MASK] from the given (Brian Drummond, film actor dubbing performances. film dubbing performance language, [MASK]) by completing the sentence "what is the language of Brian Drummond? The answer is".

predict the tail entity [MASK] from the given (Guest host, tv non character role tv regular personal appearances. tv tv regular personal appearance person, [MASK]) by completing the sentence "what is the person of Guest host? The answer is ".The answer is Bill Cosby, so the [MASK] is Bill Cosby.
predict the tail entity [MASK] from the given (Musician-GB, tv non character role tv regular personal appearances. tv tv regular personal appearance person, [MASK]) by completing the sentence "what is the person of Musician-GB? The answer is".

predict the tail entity [MASK] from the given (Twin Peaks, tv tv program genre, [MASK]) by completing the sentence "what is the genre of Twin Peaks? The answer is ".The answer is Cult film, so the [MASK] is Cult film.
predict the tail entity [MASK] from the given (House, tv tv program genre, [MASK]) by completing the sentence "what is the genre of House? The answer is".

predict the tail entity [MASK] from the given (1992 NCAA Men's Division I Basketball Tournament, time event locations, [MASK]) by completing the sentence "what is the locations of 1992 NCAA Men's Division I Basketball Tournament? The answer is ".The answer is Albuquerque, so the [MASK] is Albuquerque.
predict the tail entity [MASK] from the given (40th Academy Awards, time event locations, [MASK]) by completing the sentence "what is the locations of 40th Academy Awards? The answer is".

predict the tail entity [MASK] from the given (Chaplin, film film produced by, [MASK]) by completing the sentence "what is the produced by of Chaplin? The answer is ".The answer is Mario Kassar, so the [MASK] is Mario Kassar.
predict the tail entity [MASK] from the given (The Tourist, film film produced by, [MASK]) by completing the sentence "what is the produced by of The Tourist? The answer is".

predict the tail entity [MASK] from the given (Saudi Arabia, base aareas schema administrative area administrative area type, [MASK]) by completing the sentence "what is the administrative area type of Saudi Arabia? The answer is ".The answer is Sovereign state, so the [MASK] is Sovereign state.
predict the tail entity [MASK] from the given (South Sudan, base aareas schema administrative area administrative area type, [MASK]) by completing the sentence "what is the administrative area type of South Sudan? The answer is".

predict the tail entity [MASK] from the given (Midfielder, sports sports position players. sports sports team roster team, [MASK]) by completing the sentence "what is the team of Midfielder? The answer is ".The answer is Clube de Regatas do Flamengo, so the [MASK] is Clube de Regatas do Flamengo.
predict the tail entity [MASK] from the given (Outfielder, sports sports position players. sports sports team roster team, [MASK]) by completing the sentence "what is the team of Outfielder? The answer is".

predict the tail entity [MASK] from the given (United States of America, olympics olympic participating country medals won. olympics olympic medal honor olympics, [MASK]) by completing the sentence "what is the olympics of United States of America? The answer is ".The answer is 2008 Summer Olympics, so the [MASK] is 2008 Summer Olympics.
predict the tail entity [MASK] from the given (Switzerland, olympics olympic participating country medals won. olympics olympic medal honor olympics, [MASK]) by completing the sentence "what is the olympics of Switzerland? The answer is".

predict the tail entity [MASK] from the given (Bill Clinton, government politician government positions held. government government position held basic title, [MASK]) by completing the sentence "what is the basic title of Bill Clinton? The answer is ".The answer is Governor-GB, so the [MASK] is Governor-GB.
predict the tail entity [MASK] from the given (Andrew Jackson, government politician government positions held. government government position held basic title, [MASK]) by completing the sentence "what is the basic title of Andrew Jackson? The answer is".

predict the tail entity [MASK] from the given (Economics, education field of study students majoring. education education major field of study, [MASK]) by completing the sentence "what is the major field of study of Economics? The answer is ".The answer is Computer Science, so the [MASK] is Computer Science.
predict the tail entity [MASK] from the given (Photography, education field of study students majoring. education education major field of study, [MASK]) by completing the sentence "what is the major field of study of Photography? The answer is".

predict the tail entity [MASK] from the given (The Aviator, film film costume design by, [MASK]) by completing the sentence "what is the costume design by of The Aviator? The answer is ".The answer is Sandy Powell, so the [MASK] is Sandy Powell.
predict the tail entity [MASK] from the given (The Dark Knight Rises, film film costume design by, [MASK]) by completing the sentence "what is the costume design by of The Dark Knight Rises? The answer is".

predict the tail entity [MASK] from the given (Tambourine, music performance role regular performances. music group membership role, [MASK]) by completing the sentence "what is the role of Tambourine? The answer is ".The answer is Fiddle, so the [MASK] is Fiddle.
predict the tail entity [MASK] from the given (Vibraphone, music performance role regular performances. music group membership role, [MASK]) by completing the sentence "what is the role of Vibraphone? The answer is".

predict the tail entity [MASK] from the given (The Pianist, award award winning work awards won. award award honor award, [MASK]) by completing the sentence "what is the award of The Pianist? The answer is ".The answer is Japan Academy Prize for Outstanding Foreign Language Film, so the [MASK] is Japan Academy Prize for Outstanding Foreign Language Film.
predict the tail entity [MASK] from the given (Four Weddings and a Funeral, award award winning work awards won. award award honor award, [MASK]) by completing the sentence "what is the award of Four Weddings and a Funeral? The answer is".

predict the tail entity [MASK] from the given (Sex and the City, tv tv program country of origin, [MASK]) by completing the sentence "what is the country of origin of Sex and the City? The answer is ".The answer is United States of America, so the [MASK] is United States of America.
predict the tail entity [MASK] from the given (Saturday Night's Main Event, tv tv program country of origin, [MASK]) by completing the sentence "what is the country of origin of Saturday Night's Main Event? The answer is".