
Emerging terms: 'buzz' tags with the highest recency score (RS) over the last 52 weeks


This page provides an overview of 228 keyword tags in Ariadne, ordered by recency score.

Note: filters may be applied to display a subset of tags in this category; see the FAQs on filtering for usage tips.

Each entry below lists the term, a description, and its recency score (RS).

mashup

In Web development, a mashup is a Web page or application that uses and combines data, presentation or functionality from two or more sources to create new services. The term implies easy, fast integration, frequently using open APIs and data sources to produce enriched results that were not necessarily the original reason for producing the raw source data. The main characteristics of the mashup are combination, visualization, and aggregation. Mashups make existing data more useful, for both personal and professional use. (Excerpt from Wikipedia article: Mashup)

3.8
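For readers who prefer to see the pattern in code, here is a minimal, hypothetical sketch of a mashup: records are pulled from two independent sources and aggregated into one enriched result. The endpoint URLs and response shapes are invented for illustration and do not refer to real services.

```typescript
// Minimal mashup sketch: combine records from two hypothetical open APIs.
// The endpoint URLs and response shapes below are illustrative, not real services.

interface Venue { name: string; lat: number; lon: number }
interface Review { venueName: string; rating: number }

async function buildMashup(): Promise<Array<Venue & { rating?: number }>> {
  // Fetch from two independent sources in parallel.
  const [venuesRes, reviewsRes] = await Promise.all([
    fetch("https://example.org/api/venues"),  // hypothetical source 1
    fetch("https://example.org/api/reviews"), // hypothetical source 2
  ]);
  const venues: Venue[] = await venuesRes.json();
  const reviews: Review[] = await reviewsRes.json();

  // Aggregate: attach a rating to each venue where the names match.
  return venues.map(v => ({
    ...v,
    rating: reviews.find(r => r.venueName === v.name)?.rating,
  }));
}
```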

managerialism

Managerialism is the ideological principle that societies are equivalent to the sum of the transactions made by the managements of organizations. "The main origin of Managerialism lay in the human relations movement that took root at the Harvard Business School in the 1920s and 1930s under the guiding hand of Professor Elton Mayo. Mayo, an immigrant from Australia, saw democracy as divisive and lacking in community spirit. He looked to corporate managers to restore the social harmony that he believed the uprooting experiences of immigration and industrialization had destroyed and that democracy was incapable of repairing." (Excerpt from Wikipedia article: Managerialism)

25

lod

Linked Open Data (LOD) is part of the Open Data Movement, which aims to make data freely available to everyone. There are already various interesting open data sets available on the Web. Examples include Wikipedia, Wikibooks, Geonames, MusicBrainz, WordNet, the DBLP bibliography and many more which are published under Creative Commons or Talis licenses. The goal of the W3C SWEO Linking Open Data community project is to extend the Web with a data commons by publishing various open data sets as RDF on the Web and by setting RDF links between data items from different data sources. (Excerpt from this source)

17.2

local storage

What I will refer to as 'HTML5 Storage' is a specification named Web Storage, which was at one time part of the HTML5 specification proper, but was split out into its own specification for uninteresting political reasons. Certain browser vendors also refer to it as 'Local Storage' or 'DOM Storage.' Simply put, it's a way for web pages to store named key/value pairs locally, within the client web browser. Like cookies, this data persists even after you navigate away from the web site, close your browser tab, exit your browser, or what have you. Unlike cookies, this data is never transmitted to the remote web server (unless you go out of your way to send it manually). Unlike all previous attempts at providing persistent local storage, it is implemented natively in web browsers, so it is available even when third-party browser plugins are not. (Excerpt from this source)

63.6
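A minimal sketch of the key/value interface described above, using the standard localStorage object available in current browsers; the key name and stored object are illustrative.

```typescript
// Web Storage: store and read a named key/value pair in the browser.
// Values are strings, so structured data is usually serialised as JSON.

const prefs = { theme: "dark", fontSize: 14 };

// Persist the value; it survives tab closes and browser restarts.
localStorage.setItem("userPrefs", JSON.stringify(prefs));

// Read it back later (returns null if the key has never been set).
const raw = localStorage.getItem("userPrefs");
const restored = raw ? JSON.parse(raw) : null;

// Remove a single key, or clear everything stored for this origin.
localStorage.removeItem("userPrefs");
// localStorage.clear();
```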

linked data

Linked Data describes a method of publishing structured data, so that it can be interlinked and become more useful. It builds upon standard Web technologies, such as HTTP and URIs - but rather than using them to serve web pages for human readers, it extends them to share information in a way that can be read automatically by computers. This enables data from different sources to be connected and queried. Tim Berners-Lee, director of the World Wide Web Consortium, coined the term in a design note discussing issues around the Semantic Web project. (Excerpt from Wikipedia article: Linked Data)

25.8
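The "read automatically by computers" part typically works by dereferencing a URI with HTTP content negotiation, asking for an RDF serialisation rather than an HTML page. A minimal sketch, assuming the target server supports content negotiation (the DBpedia URI below is only an example):

```typescript
// Dereference a linked-data URI and ask for RDF instead of an HTML page.
// Uses HTTP content negotiation via the Accept header.

async function fetchAsTurtle(uri: string): Promise<string> {
  const res = await fetch(uri, {
    headers: { Accept: "text/turtle" }, // request an RDF serialisation
  });
  if (!res.ok) {
    throw new Error(`Request failed: ${res.status}`);
  }
  return res.text(); // raw Turtle; an RDF parser would turn this into triples
}

fetchAsTurtle("http://dbpedia.org/resource/Tim_Berners-Lee")
  .then(turtle => console.log(turtle.slice(0, 500)));
```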

licence

The verb license or grant licence means to give permission. The noun license (American English) or licence (British English) refers to that permission as well as to the document recording that permission. A license may be granted by a party ("licensor") to another party ("licensee") as an element of an agreement between those parties. A shorthand definition of a license is "an authorization (by the licensor) to use the licensed material (by the licensee)." In particular a license may be issued by authorities, to allow an activity that would otherwise be forbidden. It may require paying a fee and/or proving a capability. The requirement may also serve to keep the authorities informed on a type of activity, and to give them the opportunity to set conditions and limitations. (Excerpt from Wikipedia article: License)

1.2

librarything

LibraryThing is a social cataloging web application for storing and sharing book catalogs and various types of book metadata. It is used by individuals, authors, libraries and publishers. Based in Portland, Maine, LibraryThing was developed by Tim Spalding and went live on August 29, 2005. As of April 2011 it has over 1,300,000 users and more than 61 million books catalogued. The primary feature of LibraryThing is the cataloging of books by importing data from libraries through Z39.50 connections and from six Amazon.com stores. Library sources supply MARC and Dublin Core records to LT; users can import information from 690 libraries, including the Library of Congress, National Library of Australia, the Canadian National Catalogue, the British Library, and Yale University. Should a record not be available from any of these sources, it is also possible to add the book information by using a blank form. (Excerpt from Wikipedia article: LibraryThing)

7.1

library management systems

An integrated library system (ILS), also known as a library management system (LMS), is an enterprise resource planning system for a library, used to track items owned, orders made, bills paid, and patrons who have borrowed. An ILS usually comprises a relational database, software to interact with that database, and two graphical user interfaces (one for patrons, one for staff). Most ILSes separate software functions into discrete programs called modules, each of them integrated with a unified interface. Examples of modules might include: acquisitions (ordering, receiving, and invoicing materials); cataloging (classifying and indexing materials); circulation (lending materials to patrons and receiving them back); serials (tracking magazine and newspaper holdings); the OPAC (public interface for users). Each patron and item has a unique ID in the database that allows the ILS to track its activity. Larger libraries use an ILS to order and acquire, receive and invoice, catalog, circulate, track and shelve materials. (Excerpt from Wikipedia article: Library management system)

8.6
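As a rough illustration of the "unique ID in the database" point, the sketch below models patrons, items and loans as simple records and shows a toy circulation (check-out) function; all type and field names are invented rather than taken from any particular ILS.

```typescript
// Illustrative ILS data structures: each patron and item has a unique ID,
// and the circulation module records loans by referencing those IDs.

interface Patron { patronId: string; name: string }
interface Item   { itemId: string; title: string; callNumber: string }
interface Loan   { itemId: string; patronId: string; dueDate: Date; returned: boolean }

const loans: Loan[] = [];

// Circulation: lend an item to a patron for a default period of 21 days.
function checkOut(item: Item, patron: Patron, days = 21): Loan {
  const loan: Loan = {
    itemId: item.itemId,
    patronId: patron.patronId,
    dueDate: new Date(Date.now() + days * 24 * 60 * 60 * 1000),
    returned: false,
  };
  loans.push(loan);
  return loan;
}
```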

learning objects

A learning object is "a collection of content items, practice items, and assessment items that are combined based on a single learning objective". The term is credited to Wayne Hodgins, who created a working group bearing the name in 1994, though the concept was first described by Gerard in 1967. Learning objects go by many names, including content objects, chunks, educational objects, information objects, intelligent objects, knowledge bits, knowledge objects, learning components, media objects, reusable curriculum components, nuggets, reusable information objects, reusable learning objects, testable reusable units of cognition, training components, and units of learning. Learning objects offer a new conceptualization of the learning process: rather than the traditional "several hour chunk", they provide smaller, self-contained, re-usable units of learning. They will typically have a number of different components, which range from descriptive data to information about rights and educational level. At their core, however, will be instructional content, practice, and assessment. A key issue is the use of metadata. Learning object design raises issues of portability, and of the object's relation to a broader learning management system. (Excerpt from Wikipedia article: Learning Objects)

1.5

learning management system

A learning management system (commonly abbreviated as LMS) is a software application for the administration, documentation, tracking, and reporting of training programs, classroom and online events, e-learning programs, and training content. As described in (Ellis 2009) a robust LMS should be able to do the following: centralize and automate administration; use self-service and self-guided services; assemble and deliver learning content rapidly; consolidate training initiatives on a scalable web-based platform; support portability and standards; personalize content and enable knowledge reuse. (Excerpt from Wikipedia article: Learning management system)

4

learning design

Instructional Design (also called Instructional Systems Design (ISD)) is the practice of maximizing the effectiveness, efficiency and appeal of instruction and other learning experiences. The process consists broadly of determining the current state and needs of the learner, defining the end goal of instruction, and creating some "intervention" to assist in the transition. Ideally the process is informed by pedagogically (process of teaching) and andragogically (adult learning) tested theories of learning and may take place in student-only, teacher-led or community-based settings. The outcome of this instruction may be directly observable and scientifically measured or completely hidden and assumed. There are many instructional design models but many are based on the ADDIE model with the five phases: 1) analysis, 2) design, 3) development, 4) implementation, and 5) evaluation. As a field, instructional design is historically and traditionally rooted in cognitive and behavioral psychology. (Excerpt from Wikipedia article: Instructional design)

20

learning analytics

Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs. A related field is educational data mining. Differentiating the fields of educational data mining (EDM) and learning analytics (LA) has been a concern of several researchers. George Siemens takes the position that educational data mining encompasses both learning analytics and academic analytics, the latter of which is aimed at governments, funding agencies, and administrators instead of learners and faculty. Baepler and Murdoch define academic analytics as an area that "...combines select institutional data, statistical analysis, and predictive modeling to create intelligence upon which learners, instructors, or administrators can change academic behavior". They go on to attempt to disambiguate educational data mining from academic analytics based on whether the process is hypothesis driven or not, though Brooks questions whether this distinction exists in the literature. Brooks instead proposes that a better distinction between the EDM and LA communities lies in the roots of where each community originated, with authorship in the EDM community being dominated by researchers coming from intelligent tutoring paradigms, and learning analytics researchers being more focused on enterprise learning systems (e.g. learning content management systems). Regardless of the differences between the LA and EDM communities, the two areas have significant overlap both in the objectives of investigators as well as in the methods and techniques that are used in the investigation. (Excerpt from Wikipedia article: Learning analytics)

97.4

ldap

The Lightweight Directory Access Protocol (LDAP) is an application protocol for accessing and maintaining distributed directory information services over an Internet Protocol (IP) network. Directory services may provide any organized set of records, often with a hierarchical structure, such as a corporate email directory. LDAP is specified in a series of Internet Engineering Task Force (IETF) Standard Track Request for Comments (RFCs), using the description language ASN.1. An LDAP server may return referrals to other servers for requests that it cannot fulfill itself. This requires a naming structure for LDAP entries so one can find a server holding a given DN or distinguished name, a concept defined in the X.500 Directory and also used in LDAP. (Excerpt from Wikipedia article: LDAP)

1.4
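To make the distinguished-name concept concrete, the sketch below takes an example DN apart into its components; the DN itself and the naive parsing (which ignores escaped commas) are purely illustrative.

```typescript
// A distinguished name (DN) locates an entry in the directory hierarchy,
// reading from the most specific component out to the root.
const dn = "uid=asmith,ou=People,dc=example,dc=org";

// Split a DN into its relative components (naive split; real DNs may
// contain escaped commas, which a proper parser would handle).
function dnComponents(value: string): Array<{ attribute: string; value: string }> {
  return value.split(",").map(part => {
    const [attribute, v] = part.split("=");
    return { attribute: attribute.trim(), value: v.trim() };
  });
}

console.log(dnComponents(dn));
// [ { attribute: 'uid', value: 'asmith' },
//   { attribute: 'ou',  value: 'People' },
//   { attribute: 'dc',  value: 'example' },
//   { attribute: 'dc',  value: 'org' } ]
```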

lcsh

The Library of Congress Subject Headings (LCSH) comprise a thesaurus (in the information technology sense) of subject headings, maintained by the United States Library of Congress, for use in bibliographic records. LC Subject Headings are an integral part of bibliographic control, which is the function by which libraries collect, organize and disseminate documents. LCSHs are applied to every item within a library's collection, and facilitate a user's access to items in the catalogue that pertain to similar subject matter. If users could only locate items by 'title' or other descriptive fields, such as 'author' or 'publisher', they would have to expend an enormous amount of time searching for items of related subject matter, and undoubtedly miss locating many items because of the ineffective and inefficient search capability. (Excerpt from Wikipedia article: Library of Congress Subject Headings)

1.2

knowledge base

A knowledge base (abbreviated KB) is a special kind of database for knowledge management, providing the means for the computerized collection, organization, and retrieval of knowledge. The term can also refer to a collection of data representing related experiences, the results of which are linked to the problems they address and their solutions. (Excerpt from Wikipedia article: Knowledge base)

2.4

jstor

JSTOR (pronounced jay-stor; short for Journal Storage) is a digital library founded in 1995. Originally containing digitized back issues of academic journals, it now also includes books and primary sources, and current issues of journals. It provides full-text searches of more than a thousand journals, dating back to 1665 in the case of the Philosophical Transactions of the Royal Society. More than 7,000 institutions in more than 150 countries have access to JSTOR. Most access is by subscription, but some old public domain content is freely available to anyone, and in 2012 JSTOR launched a program of free access to some further articles for individual scholars and researchers who register. JSTOR was originally funded by the Andrew W. Mellon Foundation, but is now an independent, self-sustaining not-for-profit organization with offices in New York City and Ann Arbor, Michigan. In January 2009, JSTOR merged with ITHAKA, becoming part of that organization. ITHAKA is a non-profit organization founded in 2003 "dedicated to helping the academic community take full advantage of rapidly advancing information and networking technologies". (Excerpt from Wikipedia article: JSTOR)

0.8

json

JSON is a lightweight text-based open standard designed for human-readable data interchange. It is derived from the JavaScript programming language for representing simple data structures and associative arrays, called objects. Despite its relationship to JavaScript, it is language-independent, with parsers available for most programming languages. The JSON format was originally specified by Douglas Crockford, and is described in RFC 4627. The official Internet media type for JSON is application/json. The JSON filename extension is .json. The JSON format is often used for serializing and transmitting structured data over a network connection. It is primarily used to transmit data between a server and web application, serving as an alternative to XML. (Excerpt from Wikipedia article: JSON)

30.8
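A minimal example of the serialise/parse round trip described above, using the standard JSON.stringify and JSON.parse functions; the record itself is invented.

```typescript
// Serialise a simple data structure to JSON for transmission, then parse it
// back on the receiving side.

const record = {
  title: "Ariadne",
  issue: 66,
  tags: ["json", "linked data"],
};

// Object -> JSON text (what travels over the network or into a .json file).
const payload: string = JSON.stringify(record);
// '{"title":"Ariadne","issue":66,"tags":["json","linked data"]}'

// JSON text -> object again on the other end.
const parsed = JSON.parse(payload) as typeof record;
console.log(parsed.tags.length); // 2
```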

jquery

jQuery is a cross-browser JavaScript library designed to simplify the client-side scripting of HTML. It was released in January 2006 at BarCamp NYC by John Resig. Used by over 43% of the 10,000 most visited websites, jQuery is the most popular JavaScript library in use today. jQuery is free, open source software, dual-licensed under the MIT License and the GNU General Public License, Version 2. jQuery's syntax is designed to make it easier to navigate a document, select DOM elements, create animations, handle events, and develop Ajax applications. jQuery also provides capabilities for developers to create plugins on top of the JavaScript library. (Excerpt from Wikipedia article: jQuery)

79.3
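A small sketch of the selection, event-handling and Ajax style the excerpt describes, assuming jQuery is already loaded on the page (and @types/jquery if compiled as TypeScript); the element IDs and URL are illustrative.

```typescript
// Run once the DOM is ready.
$(() => {
  // Select a DOM element and attach a click handler.
  $("#load-button").on("click", () => {
    // Ajax request; populate a list with the response.
    $.getJSON("/api/tags", (tags: string[]) => {
      const items = tags.map(t => `<li>${t}</li>`).join("");
      $("#tag-list").html(items).fadeIn(); // simple animation
    });
  });
});
```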

jpeg 2000

JPEG 2000 is an image compression standard and coding system. It was created by the Joint Photographic Experts Group committee in 2000 with the intention of superseding their original discrete cosine transform-based JPEG standard (created in 1992) with a newly designed, wavelet-based method. (Excerpt from Wikipedia article: JPEG 2000)

80

jpeg

In computing, JPEG is a commonly used method of lossy compression for digital photography (image). The degree of compression can be adjusted, allowing a selectable tradeoff between storage size and image quality. JPEG typically achieves 10:1 compression with little perceptible loss in image quality. (Excerpt from Wikipedia article: JPEG)

11.3
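The adjustable size/quality tradeoff can be seen directly in the browser, where the canvas toDataURL method accepts a JPEG quality argument; in the sketch below the element ID is illustrative and the byte count is only approximate.

```typescript
// Re-encode a canvas image at two JPEG quality settings and compare sizes.
// Quality runs from 0 (smallest file) to 1 (best quality).

function encodedSize(canvas: HTMLCanvasElement, quality: number): number {
  const dataUrl = canvas.toDataURL("image/jpeg", quality);
  // Strip the data-URL prefix and estimate the decoded size of the base64 payload.
  const base64 = dataUrl.split(",")[1];
  return Math.round((base64.length * 3) / 4); // approximate bytes
}

const canvas = document.getElementById("photo") as HTMLCanvasElement;
console.log("high quality :", encodedSize(canvas, 0.95), "bytes");
console.log("low quality  :", encodedSize(canvas, 0.4), "bytes");
```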