This paper is a presentation of the FETA framework and new work with Naomi Arnold on time-varying models.
This paper describes the Raphtory system, which is used to analyse large-scale time-varying graphs. It can ingest streaming graph updates and stores the complete graph history, enabling queries over the graph at different points in its history.
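The core idea of storing the full history and querying past states can be illustrated with a toy temporal graph. This is a minimal sketch and not Raphtory's actual API: the class name, methods, and event-log representation here are all illustrative assumptions.

```python
class TemporalGraph:
    """Append-only log of timestamped edge events; a query replays
    the log to reconstruct the graph as it existed at any past time."""

    def __init__(self):
        self.events = []  # (time, op, src, dst), appended in time order

    def add_edge(self, t, src, dst):
        self.events.append((t, 'add', src, dst))

    def remove_edge(self, t, src, dst):
        self.events.append((t, 'del', src, dst))

    def snapshot(self, t):
        # Replay all events with timestamp <= t.
        edges = set()
        for time, op, src, dst in self.events:
            if time > t:
                break
            if op == 'add':
                edges.add((src, dst))
            else:
                edges.discard((src, dst))
        return edges

g = TemporalGraph()
g.add_edge(1, 'a', 'b')
g.add_edge(2, 'b', 'c')
g.remove_edge(3, 'a', 'b')
```

Querying `g.snapshot(2)` sees both edges, while `g.snapshot(3)` sees only `('b', 'c')`, since the removal has taken effect by then.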
This work in progress was accepted as a demo and at the doctoral workshop of DEBS (Distributed and Event-Based Systems). It shows the early development of a system that ingests events and can create (and eventually query) a dynamic graph.
This paper is a simulation-based study of cloud-assisted multi-user video streaming. It is based upon two use cases (one related to video poker, the other to MOOCs). The paper looks at strategies for placing cloud locations to facilitate streaming, using Amazon EC2 cloud locations, and compares a strategy that dynamically picks new locations for cloud hosts as time goes on against a static initial placement. Interestingly, the dynamic strategy seems to provide little benefit compared with simply having a good initial choice of sites, even when users drop into and out of a cloud chat session over the course of many hours.
This paper uses a likelihood-based framework to create a rigorous way to assess models of networks. Network evolution is broken down into an operation model (which decides the 'type' of change to be made to the network, e.g. "add node", "add link", "remove node", "remove link") and an object model (which decides the exact change -- which node or link to add or remove).
The system is shown to be able to recover known parameters from artificial models and to be useful in the analysis of real data.
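The likelihood calculation at the heart of this approach can be sketched in a few lines. This is a toy illustration, not the paper's actual implementation: the function names and the tiny observation sequence are assumptions, but the principle is the one described, namely that an object model assigns a probability to each observed choice, and the model that gives the observations the higher total log-likelihood is the better explanation.

```python
import math

def pa_prob(degrees, node):
    # Preferential-attachment object model: a node is chosen with
    # probability proportional to its current degree.
    return degrees[node] / sum(degrees.values())

def uniform_prob(degrees, node):
    # Null object model: every node is equally likely.
    return 1.0 / len(degrees)

def log_likelihood(observations, object_model):
    # observations: list of (degree_snapshot, chosen_node) pairs, one
    # per "add link" operation decided by the operation model.
    return sum(math.log(object_model(degs, node))
               for degs, node in observations)

# Toy observation sequence: the high-degree node 'a' keeps attracting links.
obs = [({'a': 3, 'b': 1}, 'a'),
       ({'a': 4, 'b': 1}, 'a'),
       ({'a': 5, 'b': 1}, 'a')]

ll_pa = log_likelihood(obs, pa_prob)
ll_uniform = log_likelihood(obs, uniform_prob)
```

Here the preferential-attachment model assigns a higher log-likelihood to the observed choices than the uniform null model, since the observations favour the high-degree node.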
This work can generate graphs from a very large family of models, with the aim of fitting those graphs to the parameters of real data sets.
This invited talk at the SAAT conference in Bournemouth briefly describes how the FETA model for graph evolution might be used in a streaming environment.
This talk is the latest of my talks about FETA, the framework for evolving topology analysis, and uses updated notation. The core of the work is a likelihood-based model which can assess how likely it is that observations of a graph's evolution arise from a particular probabilistic model, for example the Barabási-Albert preferential attachment model. Analysis is given for data from Facebook and from Enron, as well as from artificial models.
The aim of this paper is to provide a summary and a critique of power-law modelling of the internet. Long-range dependence and self-similarity are considered, as well as scale-free topology analysis.
This paper creates software models of how P2P network topologies could wire up. It considers possible strategies such as connecting to nearby nodes, connecting to random nodes, and so on. Resilience and delay are considered.
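The kind of strategy comparison described can be sketched with a small growth simulation. This is an illustrative toy, not the paper's model: the graph sizes, the use of node index as a stand-in for position, and the choice of average path length as a delay proxy are all assumptions made for the example.

```python
import random
from collections import deque

def grow(n, k, pick):
    # Each arriving node links to k existing nodes chosen by `pick`.
    adj = {0: {1}, 1: {0}}
    for i in range(2, n):
        targets = pick(i, list(range(i)), k)
        adj[i] = set(targets)
        for t in targets:
            adj[t].add(i)
    return adj

def nearest(i, existing, k):
    # "Connect to close nodes": node index stands in for position,
    # so the closest existing nodes are i-1, i-2, ...
    return sorted(existing, key=lambda j: i - j)[:k]

def random_pick(i, existing, k):
    # "Connect to random nodes": uniform choice among existing nodes.
    return random.sample(existing, min(k, len(existing)))

def avg_path(adj):
    # Mean shortest-path length over all ordered pairs (BFS from each node),
    # used here as a crude proxy for delay.
    total, count = 0, 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
        count += len(dist) - 1
    return total / count

random.seed(1)
near_avg = avg_path(grow(60, 2, nearest))
rand_avg = avg_path(grow(60, 2, random_pick))
```

With these settings, random wiring produces much shorter average paths than purely local wiring, which grows a long chain-like topology; this is the sort of delay trade-off the strategies in the paper explore.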