
edavey : semantic_web   9

platform/docs/rfcs/010-data_model at master · wellcometrust/platform
Wellcome Collection

As part of the platform development we create ontologies that describe our collections, events and editorial content as a unified graph of linked data. Using domain modelling and thinking about data as a semantic graph of typed entities and relationships helps us create more richly linked digital experiences that aid exploration and discovery.

The ontologies are documented using OWL.
rdf  semantic_web 
8 weeks ago by edavey
IPTC Core and Extension Guidelines
Field Reference Table
This section provides a table to help you find the guidelines for the field you are looking for:
All field names appear in alphabetical order.
Each field name links to the Guidelines section that describes the field; in electronic documents (web page, PDF), click the name to follow the link.
"Other names" lists names that differ from the IPTC specifications but are used by popular software.
"Scheme" shows whether the field belongs to the IPTC Core or the IPTC Extension schema.
"XMP Id" shows the identifier used by XMP (without a namespace) for a quick look into the XMP source.
images  photos  semantic_web  metadata  iptc 
november 2011 by edavey
LawrenceWoodman/mida - GitHub
A Microdata parser and extractor library for Ruby, based on the latest published version of the Microdata specification, dated 5th April 2011.
microdata  schema.org  seo  semantic_web 
october 2011 by edavey
Microdata - Dive Into HTML5
Defining your own microdata vocabulary is easy. First, you need a namespace, which is just a URL. The namespace URL could actually point to a working web page, although that’s not strictly required. Let’s say I want to create a microdata vocabulary that describes a person. If I own the data-vocabulary.org domain, I’ll use the URL http://data-vocabulary.org/Person as the namespace for my microdata vocabulary. That’s an easy way to create a globally unique identifier: pick a URL on a domain that you control.
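A minimal sketch of what the excerpt above describes — the `itemtype` URL is the namespace from the text, while the property names (`name`, `url`) and the sample values are illustrative assumptions, not from the original:

```html
<!-- Person marked up with the hypothetical data-vocabulary.org namespace;
     the itemprop names shown here are illustrative examples -->
<section itemscope itemtype="http://data-vocabulary.org/Person">
  <h1 itemprop="name">Jane Example</h1>
  <a itemprop="url" href="http://example.com/">homepage</a>
</section>
```

The `itemscope` attribute marks the element as one item, and `itemtype` ties its properties to the namespace URL chosen above.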
seo  semantic_web  html5 
august 2011 by edavey
schema.org FAQ - Webmaster Tools Help
schema.org is a collaboration by Google, Microsoft, and Yahoo! to improve the web by creating a structured data markup schema supported by major search engines. On-page markup helps search engines understand the information on webpages and provide richer results. A shared markup vocabulary makes it easier for webmasters to decide on a markup schema and get maximum benefit for their efforts.
seo  semantic_web 
august 2011 by edavey
CreativeWork - schema.org
What is Schema.org?
This site provides a collection of schemas, i.e., HTML tags, that webmasters can use to mark up their pages in ways recognized by major search providers. Search engines including Bing, Google and Yahoo! rely on this markup to improve the display of search results, making it easier for people to find the right web pages.
Many sites are generated from structured data, which is often stored in databases. When this data is formatted into HTML, it becomes very difficult to recover the original structured data. Many applications, especially search engines, can benefit greatly from direct access to this structured data. On-page markup enables search engines to understand the information on web pages and provide richer search results in order to make it easier for users to find relevant information on the web. Markup can also enable new tools and applications that make use of the structure.
A shared markup vocabulary makes it easier for webmasters to decide on a markup schema and get the maximum benefit for their efforts. So, in the spirit of sitemaps.org, Bing, Google and Yahoo! have come together to provide a shared collection of schemas that webmasters can use.
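A sketch of the on-page markup described above, using the CreativeWork type this bookmark points to; the properties shown (`name`, `author`) and the sample values are illustrative assumptions:

```html
<!-- CreativeWork marked up with the shared schema.org vocabulary;
     the itemprop names used here are illustrative examples -->
<div itemscope itemtype="http://schema.org/CreativeWork">
  <span itemprop="name">An Example Essay</span>
  by <span itemprop="author">Jane Example</span>
</div>
```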
seo  semantic_web 
august 2011 by edavey
Contextus
Describing Narrative in the Digital World
semantic_web  narrative 
january 2011 by edavey
Extractiv
Extractiv lets you transform unstructured web content into highly-structured semantic data.

With our powerful web crawling and text extraction tools, you can:

Crawl millions of domains every hour
Extract tons of semantic information from the content on those domains
Do it all at highly affordable prices
scraping  semantic_web  development 
november 2010 by edavey
