Three dimensional building control system and method



ABSTRACT

The system helps facility managers and other users to efficiently navigate through a building or complex of buildings, and quickly gather information for (and control) individual building systems or groups of systems. A method includes displaying an image representing at least a portion of a building, wherein at least part of the image is a three-dimensional representation; and displaying a representation of a device associated with the building, wherein the representation of the device is selectable through the user interface.

Assignee: Encelium Holdings, Inc.
USPTO Application #: 20120297346 - Class: 715850 - Published: 11/22/2012
Class 715: Data Processing: Presentation Processing Of Document, Operator Interface Processing, And Screen Saver Display Processing > Operator Interface (e.g., Graphical User Interface) > On-screen Workspace Or Object > Interface Represented By 3d Space > Navigation Within 3d Space



The Patent Description & Claims data below is from USPTO Patent Application 20120297346, Three dimensional building control system and method.


CROSS REFERENCE TO RELATED APPLICATION

This Application is a continuation of and claims priority to U.S. patent application Ser. No. 13/108,757, filed May 16, 2011, the contents of which are incorporated herein by reference.

DESCRIPTION

1. Field

The disclosure generally relates to systems and methods for building control.

2. Background

Many buildings include complex systems to provide heating, cooling, lighting, security and other services. These systems may require a significant amount of energy, which can present a significant challenge to facility managers facing rising (and volatile) energy prices, as well as requirements to be more environmentally sensitive. These challenges can be particularly acute for large commercial buildings where manual control over many disparate systems is often time-consuming, burdensome, and expensive.

These and other issues are addressed by embodiments of the present disclosure. Among other things, embodiments of the disclosure can reduce energy costs and help control and manage building systems more efficiently and effectively compared to conventional systems or methods for building control.

BRIEF SUMMARY

Among other things, embodiments help facility managers and other users to efficiently navigate through a building or complex of buildings, and quickly gather information for (and control) individual building systems or groups of systems.

A method according to various embodiments includes displaying, by a computer-based system via a user interface, an image representing at least a portion of a building, wherein at least part of the image is a three-dimensional representation; and displaying, by the computer-based system via the user interface, a representation of a device associated with the building, wherein the representation of the device is selectable through the user interface.

A system according to various embodiments includes a processor; a user interface coupled to the processor; and a memory coupled to the processor and storing instructions for: displaying, via the user interface, an image representing at least a portion of a building, wherein at least part of the image is a three-dimensional representation; and displaying, via the user interface, a representation of a device associated with the building, wherein the representation of the device is selectable through the user interface.

A computer-readable medium according to various embodiments comprises instructions for: displaying, by a computer-based system via a user interface, an image representing at least a portion of a building, wherein at least part of the image is a three-dimensional representation; and displaying, by the computer-based system via the user interface, a representation of a device associated with the building, wherein the representation of the device is selectable through the user interface.

It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure, as claimed.

BRIEF DESCRIPTION OF THE DRAWINGS

A more complete understanding of the embodiments of the present disclosure may be derived by referring to the detailed description and claims when considered in connection with the following illustrative figures.

FIG. 1 illustrates an exemplary method according to various embodiments.

FIGS. 2 and 3 are exemplary three-dimensional representations of buildings in accordance with various embodiments.

FIG. 3A is an exemplary two-dimensional view of a portion of the building represented in FIG. 3.

FIGS. 4 and 5 illustrate exemplary three-dimensional representations of the floors within a building in accordance with various embodiments.

FIGS. 6A and 6B illustrate exemplary methods according to various embodiments.

FIG. 7 is an exemplary two-dimensional representation of a building floor in accordance with various embodiments.

FIGS. 8A-8M illustrate the generation of exemplary gradient indicators in accordance with various embodiments.

FIG. 9 illustrates the generation of alternate gradient indicators in accordance with various embodiments.

FIG. 10 illustrates an exemplary system in accordance with various embodiments.

DETAILED DESCRIPTION

The detailed description of exemplary embodiments herein makes reference to the accompanying drawings and pictures, which show the exemplary embodiment by way of illustration and its best mode. While these exemplary embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure, it should be understood that other embodiments may be realized and that logical and mechanical changes may be made without departing from the spirit and scope of the disclosure. Thus, the detailed description herein is presented for purposes of illustration only and not of limitation. For example, the steps recited in any of the method or process descriptions may be executed in any order and are not limited to the order presented. Moreover, any of the functions or steps may be outsourced to or performed by one or more third parties. Furthermore, any reference to singular includes plural embodiments, and any reference to more than one component may include a singular embodiment.

Systems, methods and computer program products are provided. In the detailed description herein, references to “one embodiment”, “an embodiment”, “an example embodiment”, etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described. After reading the description, it will be apparent to one skilled in the relevant art(s) how to implement the disclosure in alternative embodiments.

In various embodiments, the methods described herein are implemented using the various particular machines described herein. The methods described herein may be implemented using the below particular machines, and those hereinafter developed, in any suitable combination, as would be appreciated immediately by one skilled in the art. Further, as is unambiguous from this disclosure, the methods described herein may result in various transformations of certain articles. The disclosure may be implemented as a method, system or in a computer readable medium.

Three Dimensional Display

FIG. 1 illustrates an exemplary method for building control and management that provides three-dimensional views of a building or complex according to an embodiment. This method, as well as the other methods described herein, may be performed by one or more computer-based systems (i.e., any system that includes, or operates in conjunction with, a computer system), such as the exemplary computer-based system depicted in FIG. 10 and described in more detail below. All, or a portion, of the steps in method 100 may be performed in any suitable order.

The exemplary method 100 includes displaying, (e.g., via a user-interface including a display screen), an image representing at least a portion of a building, where at least part of the image is a three-dimensional representation (110). The method 100 further includes displaying a representation of a device via the user interface (120), displaying information for the device (130), and facilitating the configuration of the device (140). Method 100 also includes adjusting the configuration of the image (150), receiving a selection of a portion of the represented building (160), and providing an animation in accordance with the selection of the building portion (170).
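The sequence above can be sketched as an ordered list of step handlers (a minimal sketch; the function names are illustrative assumptions, only the step numbering comes from the patent):

```python
# Hypothetical handlers for the steps of exemplary method 100; each returns a
# short status string tagged with the patent's step number.
def display_image():      return "110: image with 3D portion displayed"
def display_device():     return "120: device representation displayed"
def display_info():       return "130: device information displayed"
def configure_device():   return "140: device configuration facilitated"
def adjust_image():       return "150: image configuration adjusted"
def receive_selection():  return "160: building-portion selection received"
def animate_selection():  return "170: animation presented for selection"

# The patent notes the steps may run in any suitable order; one default pass:
METHOD_100 = [display_image, display_device, display_info, configure_device,
              adjust_image, receive_selection, animate_selection]
results = [step() for step in METHOD_100]
```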

FIGS. 2 and 3 illustrate exemplary three-dimensional representations of buildings that may be displayed (110) in accordance with embodiments. Any desired portion of a structure (or group of structures) may be displayed in accordance with embodiments. Additionally, representations of any desired geographical features, landmarks, or other objects may be displayed in conjunction with the representation of the building(s). The image 200 in FIG. 2, for example, includes a three-dimensional representation of an office building 210, as well as representations of streets 220 and 230 external to the building.

Portions of the image may be colored, shaded, transparent, opaque, or include any other desired characteristic in order to convey information to a user. In image 200 of FIG. 2, for example, the representation of building 210 includes a semi-transparent exterior 240, allowing representations of the floors 250 within the building 210 to be seen. The image 300 in FIG. 3 includes a three-dimensional representation of a coliseum 310. In this embodiment, a semi-transparent ground layer representation 320 helps distinguish between levels at or above ground level (330) and levels below ground level (340).

While FIGS. 2 and 3 display entire structures, any desired portion of a building may be displayed in accordance with embodiments. Additionally, embodiments may alternately display buildings (or portions thereof) in two-dimensional or three-dimensional representations, or a combination of the two. Among other things, embodiments provide a user with the flexibility to quickly and easily navigate through a building to gather information and/or control various systems and other features of the building. FIG. 3A, for example, depicts a two-dimensional view of a portion of the coliseum 310 shown in FIG. 3. In one embodiment, the image in FIG. 3A can be displayed in response to a user selecting the portion of the coliseum 310 via the user interface (160).

An image displayed in accordance with embodiments may include any desired features of a building, such as interior and exterior walls, cubicles, offices, light fixtures, HVAC ventilation systems, plumbing, doorways, stairways, furniture, repair equipment, and any other static or dynamic object or feature. For example, the image in FIG. 3A includes representations of features such as walls, hallways, and offices 370, and displays a representation of a device 375 (a lighting control device in this example) (120) in each respective office 370. The shading in the representation of office 380 signifies that it has been selected by a user (e.g., by using a control device of a user interface), which can allow the user to gather information on the office 380, update status information for the office 380, or perform any other desired operation. The representation of device 385 may likewise be selected by a user in order to, for example, display information for the device (130) and/or to configure the device 385 (140). The user may select a representation of a device or portion of a building in any suitable manner, including by using a control device of the user interface (such as a mouse or keyboard) to click, highlight, and/or hover over the desired representation.

The user may also control various features and functionality of the device through the user interface according to embodiments. In one exemplary embodiment, a user may select the representation of lighting device 385 in order to turn the lights on or off in office 380, or to set a timer to turn the lights in office 380 on or off at a predetermined time. In another exemplary embodiment, a user can alternately hide or show a feature, group of features, or class of features via the user interface. Among other things, this allows a user to select the level of detail he/she wishes to view in the image, as well as to reduce unwanted clutter in an image. FIG. 7 illustrates an exemplary two-dimensional image of a floor selected by a user. In this example, image 700 includes a menu control 710 to allow a user to display or hide various features on the floor, as well as to configure devices for the floor, such as sensors.
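The lighting-control interaction described above can be sketched as follows (a minimal sketch; the class, method names, and office identifier are hypothetical, not from the patent):

```python
# Illustrative model of a selectable lighting device: a user clicking its
# representation toggles the lights, or schedules a switch at a preset hour.
class LightingDevice:
    def __init__(self, name):
        self.name = name
        self.on = False
        self.timer = None          # (hour, desired state), if scheduled

    def toggle(self):
        """Turn the lights on or off (step 140-style configuration)."""
        self.on = not self.on
        return self.on

    def schedule(self, hour, state):
        """Set a timer to switch the lights at a predetermined time."""
        self.timer = (hour, state)

office_380 = LightingDevice("office 380 lights")
office_380.toggle()            # user selects the device representation
office_380.schedule(18, False) # lights off at 18:00
```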

Embodiments can display information about any desired devices, systems, or portions of a building (130). For example, embodiments may be used to display information regarding parameters such as a lighting status; a lighting level; a load shedding status; a lighting power density; a lighting power consumption; an occupancy status; a comparative energy trend; a temperature level; a humidity level; a coverage range of a sensor; and/or a carbon dioxide level. Information on such parameters may be displayed in any desired format. In one embodiment, described in more detail below, information regarding parameters is graphically displayed using gradient indicators. In one embodiment, parameters may be associated with a device and/or a zone of the building. Information (such as a parameter) related to a device can be displayed in response to a selection of a representation of the device by a user. Device information may be presented in any other suitable manner. In one embodiment, for example, information for a device is displayed and periodically updated next to the representation of the device in the display image to allow a user to quickly see the status of a group of devices.

Embodiments may allow a user to configure (140) or perform other functions using a representation of a device in any suitable manner. For example, image 300 includes buttons 350 for configuring, operating, and analyzing various devices and groups of devices in the coliseum. Embodiments may also allow a user to perform diagnostics on any desired system in a building.

Method 100 allows a user to adjust the properties of an image (150). In FIG. 3, for example, menu bar 360 allows a user to configure the image 300, as well as to select portions thereof (160). In one embodiment, the menu bar 360 allows a user to alternate between two-dimensional and three-dimensional representations, tilt the image, rotate the image, and to increase or decrease the scale of the image. In one embodiment, a user can add, delete, and manipulate the elements in an image, such as to alter the shape of a lighting zone, add or remove a cubicle or internal wall, add labeling to portions of the image, and/or perform any other desired modifications to the image. In another embodiment, elements in an image can be altered or updated automatically in response to an event. For example, if an occupant of a building turns on lights in the building, the image can be automatically updated to show which lights are on and/or the areas or zones illuminated by the active lights (see, e.g., the display of gradient indicators below).

Embodiments may allow a user to define zones of interest within the image. Such zones may be of any size, shape, and volume. A zone may be any volume inside, outside, or associated with the building in any way. Zones may be defined according to boundaries (such as internal walls of the building), in accordance with one or more devices (such as the zone of illumination of a lighting device), and/or arbitrarily defined according to a desired criteria (such as a zone that includes the offices of the human resources department on the floor of a building).
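One way an arbitrarily shaped zone could be represented is as a 2D polygon on a floor, with membership decided by a standard ray-casting point-in-polygon test. This is a sketch of one plausible approach, not the patent's method; the zone coordinates are hypothetical:

```python
# Ray-casting test: a point is inside a polygon if a horizontal ray from it
# crosses the polygon's edges an odd number of times.
def point_in_zone(x, y, polygon):
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical zone covering the human resources offices on a floor:
hr_zone = [(0, 0), (10, 0), (10, 5), (0, 5)]
point_in_zone(4, 2, hr_zone)   # a desk inside the zone
```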

Embodiments can display information for a building in a manner that is not only efficient and effective, but visually pleasing to users. In one exemplary embodiment, referring now to FIG. 4, a plurality of floors (410, 420, 430, and 440) are displayed in image 400. As shown, floors 410, 420, 430, and 440 each have vanishing points in their x-axis and y-axis, but no vanishing point in their z-axis. Instead, the floors 410, 420, 430, and 440 are arranged in a stack by offsetting the y-coordinate of adjacent floors (i.e., floor 410 appears further towards the rear of the image 400 than floor 420). Among other things, by not diminishing the size of floors due to a z-axis vanishing point, multiple floors of a building (and information associated therewith) can be displayed in full, which provides more information in a visually pleasing manner to a user. Additionally, the vertical (z-axis) spacing between floor representations 410, 420, 430, and 440 is larger than the spacing between the floors of the actual building. Furthermore, the front boundary line of each floor 410, 420, 430, 440 is of substantially equal thickness to the rear boundary line of the floor. Among other things, each of these features helps a user view each floor and its respective detail in its entirety.

While each floor in FIG. 4 is depicted in a flat plane, alternate embodiments may (with additional z-axis layering) depict some or all of the elements of a three-dimensional representation using a vanishing point in the z-axis. For example, it is possible to depict the position of sun shades on a building by showing them “upright” as rectangles. In one embodiment, complex z-ordering can be (at least partially) avoided by determining the distance between the element and the viewpoint of the user, and by drawing distant elements first.
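The "draw distant elements first" shortcut above is essentially the painter's algorithm: sort elements by distance from the viewpoint and render far-to-near, so nearer elements overdraw farther ones. A minimal sketch (the element/viewpoint structures are illustrative assumptions):

```python
import math

def draw_order(elements, viewpoint):
    """Return elements sorted far-to-near from the viewpoint, so that
    drawing in this order lets nearer elements overdraw farther ones."""
    return sorted(elements,
                  key=lambda el: math.dist(el["pos"], viewpoint),
                  reverse=True)

elements = [
    {"name": "sun shade A", "pos": (0.0, 0.0, 10.0)},
    {"name": "sun shade B", "pos": (0.0, 0.0, 2.0)},
]
order = [e["name"] for e in draw_order(elements, (0.0, 0.0, 0.0))]
# The farther shade is drawn first, avoiding explicit z-ordering.
```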

An animation may be presented in response to the selection of a portion of a building (170). In one embodiment, referring now to FIG. 5, image 500 includes representations of four floors (510, 520, 530, and 540). An indicator identifying the floors is displayed next to each floor (515, 525, 535, and 545, respectively). A user may select any of floors 510, 520, 530 or 540, or use the floor selection control 550 to quickly display floors in other parts of the building. In one embodiment, where the user is working on floor 540 and selects floor 520, an animation is presented to drop floor 540 to the bottom of the displayed floors (as shown in FIG. 5) and present floor 520 in the middle of the image 500. In this embodiment, the image is presented at a lower level of detail as the floors are moving during the animation, and presents the image at a higher level of detail once the animation is complete and the floors are stationary in the image. Similar animations may be performed when tilting, rotating, panning, and zooming the image. Where the image is generated in layers (as described below), selected layers of the image may be presented at a higher or lower level of detail during animation. Additionally, as shown in FIG. 5, non-selected portions of a building (such as floors 510 and 540) may be shown in lower-levels of detail from the selected portion(s) (i.e., floor 520 in FIG. 5). Among other things, this reduces the amount of graphical rendering required by a computer system displaying the image, and allows a user to quickly identify the selected floor or other selected portion of a building.

Embodiments may generate and display a three-dimensional representation of a building, building feature, or portions thereof in any suitable manner. In one exemplary embodiment, displaying a three-dimensional representation of a building feature includes identifying a visible feature of the building to display in the three-dimensional representation, and identifying an invisible feature of the building to omit from the three-dimensional representation. Determining visible and invisible features may be performed in any suitable manner, such as by analyzing a feature along its z-axis, identifying a class to which the feature belongs (e.g., a wall, furniture, a stairwell), and/or analyzing the feature's position relative to other nearby features. Visible features can be displayed in the three-dimensional representation in layers, each layer corresponding to a class of features for the building. For example, the boundary of a building floor may correspond to one layer, while walls and furniture correspond to another layer. A more detailed exemplary method for generating a three-dimensional representation is described below.

Exemplary Generation of a 3D Image

The following description illustrates one possible method for generating a three-dimensional representation. In this embodiment, generation of the three-dimensional representation includes an analysis of the area to be viewed, the scale of the image, and the rotation and translation of the image. Additionally, a floor plan (such as a reflected ceiling plan or CAD drawing) for the portion of the building being rendered is analyzed and parsed into tiles of a predetermined size (e.g., 25′×25′). Among other things, the division of the image into tiles helps improve performance of a computer system displaying the image by, as described more below, determining in advance whether the tile or portions thereof are even visible. Calculations for invisible tiles and portions thereof can thus be avoided.
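Partitioning a floor plan's extent into fixed-size tiles can be sketched as below (the grid representation and floor dimensions are assumptions; only the 25′ tile size comes from the patent's example):

```python
# Partition a width x depth floor into a grid of tile bounding boxes
# (x0, y0, x1, y1); edge tiles are clipped to the floor's extent.
def make_tiles(width_ft, depth_ft, tile_ft=25):
    tiles = []
    y = 0
    while y < depth_ft:
        x = 0
        while x < width_ft:
            tiles.append((x, y,
                          min(x + tile_ft, width_ft),
                          min(y + tile_ft, depth_ft)))
            x += tile_ft
        y += tile_ft
    return tiles

grid = make_tiles(100, 50)   # a hypothetical 100' x 50' floor: 4 x 2 tiles
```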

Features within each tile (which may be any visible object such as walls, doors, furniture, stairways, and other objects) are sorted to determine features that will be invisible. Features may be omitted according to any desired criteria. For example, a furniture layer may be omitted from calculations and from drawing onto the screen if performance would suffer. The floor's outline and structural walls, however, may be assigned a higher priority and therefore still be drawn. Among other things, this exemplary approach helps balance detail on the screen with performance demands, as well as with aesthetics.
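The priority-based culling above can be sketched as a simple filter: when performance would suffer, low-priority layers such as furniture are dropped while the floor outline and structural walls survive. The priority values and feature records here are assumptions for illustration:

```python
# Lower number = higher priority; the outline and structural walls are kept
# even when the detail budget forces furniture to be culled.
LAYER_PRIORITY = {"outline": 0, "structural_wall": 1, "door": 2, "furniture": 3}

def visible_features(features, max_priority):
    """Keep only features whose layer priority is within the cutoff."""
    return [f for f in features
            if LAYER_PRIORITY[f["layer"]] <= max_priority]

features = [
    {"id": "wall-1", "layer": "structural_wall"},
    {"id": "desk-7", "layer": "furniture"},
    {"id": "floor",  "layer": "outline"},
]
# Under load, cull everything below structural-wall priority:
kept = [f["id"] for f in visible_features(features, max_priority=1)]
```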

A transformation matrix may be calculated by analyzing the scale, rotation and translation of the image. In this exemplary embodiment, for an image displaying multiple floors of a building, a determination is made as to whether each floor (or portions thereof) is visible and, if so, which tiles of the floor are visible. A determination is made as to where on the display screen of a user interface the three-dimensional points of the floor would be projected. This determination is performed by setting the z-axis coordinate for the floor to zero, applying a transformation matrix, projecting the coordinate into the two-dimensional coordinates of the user interface display screen, and adjusting the y-axis screen coordinates of the point up or down (i.e., parallel shifting the coordinates on the screen) by adding an offset proportional to the floor's z-coordinate. In this embodiment, the z-coordinate is removed from each point, thereby positioning the user's viewpoint at the same relative coordinate for each floor, and shifting the resulting floor parallel to the y-axis. This approach removes the vanishing point in the z-axis, which, as described above, helps users to see the full detail of multiple floors.
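The projection described above can be sketched as follows: drop each point's z-coordinate, apply an in-plane transform (scale, rotation, translation), then parallel-shift the screen y-coordinate by an amount proportional to the floor's height. The concrete matrix form and the `z_spacing` constant are assumptions:

```python
import math

def project_point(x, y, floor_z, scale=1.0, angle=0.0, dx=0.0, dy=0.0,
                  z_spacing=40.0):
    """Project a floor-plan point to screen coordinates with no z-axis
    vanishing point: identical plan points on different floors differ
    only by a vertical screen offset."""
    # In-plane scale + rotation + translation (z treated as zero).
    sx = scale * (x * math.cos(angle) - y * math.sin(angle)) + dx
    sy = scale * (x * math.sin(angle) + y * math.cos(angle)) + dy
    # Parallel-shift the whole floor by an offset proportional to its z.
    sy -= z_spacing * floor_z
    return sx, sy

# The same plan point on floors 0 and 2 lands at the same x, offset in y:
p0 = project_point(10.0, 5.0, floor_z=0)
p2 = project_point(10.0, 5.0, floor_z=2)
```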

In this exemplary embodiment, the potentially visible area of a floor is analyzed. If the analysis indicates that no portion of the potentially visible area is actually visible, no further processing for the floor is performed. Otherwise, the visible area of the floor is subdivided (e.g., the rectangle created by the corner points of the floor is subdivided into four quadrants) and a determination is made as to whether any portion of each quadrant is visible. The visible area is repeatedly subdivided in like fashion until it is known which tiles of the floor are visible, yielding a list of full or partially visible tiles for the floor.
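The repeated subdivision above amounts to a quadtree-style visibility query: prune any quadrant that misses the viewport, and refine overlapping quadrants down to tile size. A sketch under the assumption of axis-aligned rectangles for both floor area and viewport:

```python
# Return the tile-sized rectangles of `rect` that overlap `viewport`.
# Rectangles are (x0, y0, x1, y1); quadrants that miss the viewport are
# pruned without examining their tiles.
def visible_tiles(rect, viewport, tile=25):
    x0, y0, x1, y1 = rect
    vx0, vy0, vx1, vy1 = viewport
    # No overlap: skip this entire subtree.
    if x0 >= vx1 or x1 <= vx0 or y0 >= vy1 or y1 <= vy0:
        return []
    # Down to a single tile: it is fully or partially visible.
    if x1 - x0 <= tile and y1 - y0 <= tile:
        return [rect]
    # Otherwise subdivide into four quadrants and recurse.
    mx, my = (x0 + x1) / 2, (y0 + y1) / 2
    return (visible_tiles((x0, y0, mx, my), viewport, tile)
            + visible_tiles((mx, y0, x1, my), viewport, tile)
            + visible_tiles((x0, my, mx, y1), viewport, tile)
            + visible_tiles((mx, my, x1, y1), viewport, tile))

# A 100' x 100' floor with only its lower-left corner on screen:
tiles = visible_tiles((0, 0, 100, 100), viewport=(0, 0, 30, 30))
```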



PATENT INFO

Application #: US 20120297346 A1
Publish Date: 11/22/2012
Document #: 13108897
File Date: 05/16/2011
USPTO Class: 715850
International Class: G06F 3/048
Drawings: 16


