A. A. Gebremariam, M. Chowdhury, A. Goldsmith, F. Granelli

Network function virtualization enables multiple user-oriented services to run on a common physical infrastructure. In this work, we present an implementation of static and dynamic resource allocation schemes and use it to study how frequently trigger events for resource reallocation must occur. Our results suggest that performance benefits (e.g., fewer dropped-packet events) can be obtained even if dynamic spectrum-level slicing does not occur on time scales comparable to those of the local resource schedulers residing in each virtual network.
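
To make the two-timescale idea concrete — per-slot local scheduling inside each slice versus much rarer spectrum-level re-slicing — the following toy Python sketch simulates two slices sharing a fixed pool of resource blocks. Every name, parameter, and the demand model here is hypothetical and not taken from the paper; the sketch only illustrates that demand-driven re-splitting triggered far less often than the local schedulers run can still cut dropped packets relative to a static split.

```python
import random

TOTAL_RBS = 50          # resource blocks shared by the slices (assumed value)
SLOTS = 10_000          # simulated local-scheduler slots
REALLOC_PERIOD = 100    # slots between spectrum-level slicing triggers (assumed)

def offered_load(t, slice_id):
    """Toy time-varying demand (packets per slot); load shifts between slices."""
    base = 30 if (t // 2000) % 2 == slice_id else 10
    return max(0, int(random.gauss(base, 3)))

def simulate(dynamic):
    shares = [TOTAL_RBS // 2, TOTAL_RBS - TOTAL_RBS // 2]  # static 50/50 split
    recent = [0, 0]   # demand observed since the last trigger event
    dropped = 0
    for t in range(SLOTS):
        demand = [offered_load(t, s) for s in (0, 1)]
        for s in (0, 1):
            recent[s] += demand[s]
            # local scheduler: each slot, serve up to the slice's current share
            dropped += max(0, demand[s] - shares[s])
        # spectrum-level trigger: re-split the pool proportionally to demand,
        # but only once every REALLOC_PERIOD slots
        if dynamic and t % REALLOC_PERIOD == REALLOC_PERIOD - 1:
            total = sum(recent) or 1
            shares[0] = round(TOTAL_RBS * recent[0] / total)
            shares[1] = TOTAL_RBS - shares[0]
            recent = [0, 0]
    return dropped

random.seed(1)
print("static  drops:", simulate(dynamic=False))
print("dynamic drops:", simulate(dynamic=True))
```

Even with re-slicing occurring only once per 100 scheduling slots, the dynamic run drops far fewer packets than the static split in this toy model, consistent with the abstract's claim that slicing need not track the local schedulers' time scale.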