CROSS REFERENCE TO RELATED APPLICATIONS
This application claims the benefit of U.S. Provisional Patent Application Ser. Nos. 61/507,983 and 61/556,945, filed on Jul. 14, 2011 and Nov. 8, 2011, respectively. The disclosures of the provisional patent applications are hereby incorporated by reference for all purposes.
With the proliferation of computing and networking technologies, two aspects of computing devices have become prevalent: non-traditional (i.e., other than mouse and keyboard) input mechanisms and smaller form factors. User interfaces for all kinds of software applications have been designed taking typical screen sizes and input mechanisms into account. Thus, user interactions in conventional systems are presumed to be through keyboard and mouse type input devices and a minimum screen size that enables users to interact with the user interface at a particular precision.
Limited display real estate prevents many portable devices from providing full featured content management functionality. Furthermore, gestural commanding is not efficient with conventional menus, which offer limited support for small displays or merely take into account where the user's finger/mouse/pen is. Additionally, display devices such as projectors, monitors, and televisions may lack controls for providing content management functionality. Modern software solutions such as on screen keyboards may be awkward to type on and consume valuable display area. The lack of adequate software solutions for managing content on non-traditional devices largely limits device use to content consumption. Carrying multiple devices for content management and consumption defeats portability and unnecessarily takes away from an enriching singular source for content consumption and management.
Limited screen space in mobile devices presents a significant challenge to delivering effective control interfaces. For example, in conventional systems, color choices are provided through multi step menu controls to enable a user to adjust various facets of color selections. Similarly, in conventional systems, users are enabled to alter the color of graphics, shapes, objects, etc., through complex menu structures providing extensive functionality to modify many attributes such as shading and light effects. However, screen size limitations and lack of input options can force designers of mobile systems to provide simplified, and often less natural or more cumbersome, controls for color functionality. It may be very hard to select a specific color, for example, because the color palette is too small on a tablet/slate device. Such limited solutions can fail to reproduce coloring features provided by their conventional counterparts.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to exclusively identify key features or essential features of the claimed subject matter, nor is it intended as an aid in determining the scope of the claimed subject matter.
Embodiments are directed to managing content color through a context based color menu. Context based color menus may be deployed for a variety of content color management scenarios. An application according to embodiments may present the context based color menu upon activation of the menu by a user input. The input may include a tap, a swipe, a keyboard entry, a mouse click, a voice command, a visual input, a pen input, and/or a gesture action. Next, the application may assign a color to selected content according to another detected input.
These and other features and advantages will be apparent from a reading of the following detailed description and a review of the associated drawings. It is to be understood that both the foregoing general description and the following detailed description are explanatory and do not restrict aspects as claimed.
BRIEF DESCRIPTION OF DRAWINGS
FIGS. 1A and 1B illustrate some example devices, where context based color menus may be employed;
FIG. 2 illustrates an example context based color menu with controls to manage content color according to embodiments;
FIG. 3 illustrates another example context based color menu with content type controls according to embodiments;
FIG. 4 illustrates an example scenario of applying a color to a content type using a context based color menu according to embodiments;
FIG. 5 illustrates an example context based color menu with additional sub-level color controls integrated into the menu according to embodiments;
FIG. 6 illustrates an example context based color menu with color indicators according to embodiments;
FIG. 7 illustrates other examples of context based color menus according to embodiments;
FIG. 8 is a networked environment, where a system according to embodiments may be implemented;
FIG. 9 is a block diagram of an example computing operating environment, where embodiments may be implemented; and
FIG. 10 illustrates a logic flow diagram for a process of managing content color through context based color menu in touch and gesture enabled devices according to embodiments.
As briefly described above, a user interface of an application executing on a device may present a context based color menu in relation to displayed content in response to an activation of the menu. The context based color menu may provide controls to manage content. Next, the application may detect another input activating a control within the context based color menu. The input may include touch, gesture, keyboard entry, mouse click, and/or pen input. The application may execute a command associated with the input to assign a color to selected content.
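For illustration purposes only, the activation-then-assign flow described above may be sketched in Python. The class, method, and action names below are assumptions made for this sketch and do not appear in the specification; the sketch merely demonstrates a first input presenting the menu and a second input executing a color assignment command.

```python
# Minimal sketch of the described flow: a first input presents the menu,
# and a second input executes a command assigning a color to selected
# content. All identifiers here are illustrative assumptions.
class ContextBasedColorMenu:
    ACTIVATION_INPUTS = {"touch", "gesture", "keyboard", "mouse", "pen"}

    def __init__(self):
        self.visible = False

    def handle_input(self, kind, command=None, color=None, content=None):
        if not self.visible:
            if kind in self.ACTIVATION_INPUTS:
                self.visible = True   # present the menu over displayed content
            return content
        if command == "assign_color" and content is not None:
            content["color"] = color  # execute the color assignment command
            self.visible = False      # dismiss the menu after applying
        return content
```

Any of the listed input kinds could activate the menu in this sketch; an actual implementation would dispatch richer event objects rather than strings.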
In the following detailed description, references are made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments or examples. These aspects may be combined, other aspects may be utilized, and structural changes may be made without departing from the spirit or scope of the present disclosure. The following detailed description is therefore not to be taken in a limiting sense, and the scope of the present invention is defined by the appended claims and their equivalents. While the embodiments will be described in the general context of program modules that execute in conjunction with an application program that runs on an operating system on a personal computer, those skilled in the art will recognize that aspects may also be implemented in combination with other program modules.
Generally, program modules include routines, programs, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that embodiments may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and comparable computing devices. Embodiments may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
Embodiments may be implemented as a computer-implemented process (method), a computing system, or as an article of manufacture, such as a computer program product or computer readable media. The computer program product may be a computer storage medium readable by a computer system and encoding a computer program that comprises instructions for causing a computer or computing system to perform example process(es). The computer-readable storage medium is a computer-readable memory device. The computer-readable storage medium can for example be implemented via one or more of a volatile computer memory, a non-volatile memory, a hard drive, a flash drive, a floppy disk, or a compact disk, and comparable media.
According to embodiments, a user interface of a touch-enabled or gesture-enabled device may employ context based color menus to manage content such as textual content, shape, object, line, 3D effect, graphics, tables, etc. A context based color menu may make use of features specific to touch or gesture enabled computing devices, but may also work with a traditional mouse and keyboard. A context based color menu is an example of a context based menu. Context based menus, in general, may be used to provide quick access to commonly used commands while viewing or editing displayed content such as documents, emails, contact lists, other communications, or any content (e.g., audio, video, etc.). Context based menus may appear as part of a user interface's regular menu, in a separate viewing pane (e.g., a window) outside or inside the user interface, and so on. Typically, context based menus present a limited set of commands for easy user access, but additional submenus may be presented upon user selection. Commonly used context based menus may appear over the viewed document. A tap or swipe action as used herein may be provided by a user through a finger, a pen, a mouse, or similar device, as well as through predefined keyboard entry combinations or a voice command.
FIGS. 1A and 1B illustrate some example devices, where context based color menus may be employed. As touch and gesture based technologies proliferate and computing devices employing those technologies become common, user interface arrangement becomes a challenge. Touch and/or gesture enabled devices, specifically portable devices, tend to have smaller screen sizes, which means less available space for user interfaces. For example, in a user interface that enables editing of a document (text and/or graphics), in addition to the presented portion of the document, a virtual keyboard may have to be displayed, further limiting the available space (“real estate”). Thus, in such scenarios, providing a full control menu may be impractical or impossible. Embodiments are directed to context based color menus to assign color to content including text, shapes, lines, objects, etc.
As mentioned above, smaller available display space, larger content, and different aspect ratios make conventional menus impractical. Existing touch-based devices such as tablet PCs and similar ones are typically directed to data consumption (i.e., viewing). On the other hand, commonly used applications such as word processing applications, spreadsheet applications, presentation applications, and comparable ones are directed to creation (generating and editing documents with textual, graphical, and other content). Currently available context based menus are either invisible most of the time or they block the content when they are visible. A context based color menu according to some embodiments may be provided dynamically based on presented content and available space while providing ease of use without usurping much needed display area.
Referring to FIGS. 1A and 1B, some example devices are illustrated, where a context based color menu may be provided according to embodiments. A context based color menu may be an embodiment of a context based menu. Embodiments may be implemented in touch and/or gesture enabled devices or others with keyboard/mouse/pen input, with varying form factors and capabilities.
Device 104 in FIG. 1A is an example of a large size display device, where a user interface may be provided on screen 106. Functionality of various applications may be controlled through hardware controls 108 and/or soft controls such as a context based color menu displayed on screen 106. A user may be enabled to interact with the user interface through touch actions or gestures (detected by a video capture device). A launcher indicator may be presented at a fixed location or at a dynamically adjustable location for the user to activate the context based color menu. Examples of device 104 may include public information display units, large size computer monitors, and so on.
Device 112 in FIG. 1A is an example of the use of a context based color menu to control functionality. A user interface may be displayed on a screen or projected onto a surface, and actions of user 110 may be detected as gestures through video capture device 114. The user's gestures may activate a context based color menu to assign color to content as displayed on the device 112.
FIG. 1B includes several example devices such as touch enabled computer monitor 116, laptop computer 118, handheld computer 124, smart phone 126, tablet computer (or slate) 128, and mobile computing device 132, which may be used for computing, communication, control, measurement, and a number of other purposes. The example devices in FIG. 1B are shown with touch activation 120. However, any of these and other example devices may also employ gesture enabled activation of context based color menus to assign color to content. In addition, tools such as pen 130 may be used to provide touch input. A context based color menu may be controlled also through conventional methods such as a mouse input or input through a keyboard 122.
FIG. 2 illustrates an example context based color menu with controls to manage content color according to embodiments. Diagram 200 displays a context based color menu 202 with controls to manage a color of content including textual content and shapes.
According to some embodiments, a user interface of an application may display context based color menu 202 with navigation control 210 to access a prior context based menu. The prior context based menu may be a top level color menu providing color controls for a wider range of colors. If the menu 202 is itself a top level color menu, then the navigation control 210 may provide access to a context based menu with controls to select the type of content to which to assign the color, such as textual content, shapes, etc. Alternatively, the navigation control 210 may be used to change the context based color menu 202 to a collapsed state (i.e., visually minimized).
According to other embodiments, the menu 202 may have a set of color controls positioned radially adjacent to each other within the menu 202. The radial menu 202 is just one example embodiment. A context based color menu may take many forms and shapes including a linear shape menu, a half circular shape menu, an arc shape menu, etc. A color control 206 may also take multiple forms, including color controls displaying a single color or a continuous color control to select a color from a spectrum of colors.
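One plausible geometry for the radial placement of color controls can be sketched as follows. The function name, the even angular spacing, and the choice of starting angle are assumptions for illustration; the specification does not prescribe any particular layout computation.

```python
import math

def radial_positions(n_controls, radius, center=(0.0, 0.0), start_deg=-90.0):
    """Place n controls evenly around a circle, as one way a radial
    context based color menu might lay out its color controls. The
    even spacing and start angle are illustrative assumptions."""
    cx, cy = center
    positions = []
    for i in range(n_controls):
        angle = math.radians(start_deg + i * 360.0 / n_controls)
        positions.append((cx + radius * math.cos(angle),
                          cy + radius * math.sin(angle)))
    return positions
```

A linear or arc shaped menu would use a different placement function; only the per-control coordinates change, not the menu's command behavior.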
According to yet other embodiments, the menu 202 may display sub-menu launcher 204 adjacent to color controls to launch sub-menus associated with the color controls. An example sub-menu launcher 204 may launch a sub-menu of sub colors associated with the adjacent color control. An example sub-menu may include color controls for shades of a color provided by the color control in the menu 202. Once the application detects an input activating a sub-menu launcher 204, the application may present a sub-menu including a set of sub colors associated with the color control adjacent to the sub-menu launcher. In response to detecting another input selecting one of the sub color controls, the application may assign a sub color associated with the selected sub color control to the content. If there is a sub-menu, the color last selected in that sub-menu may bubble up to the top level menu so that the user has quick access back to it (without having to go pick the specific hue, e.g., again). Thus, most recently used (MRU) or most frequently used (MFU) colors may be pushed to the top level menu for ease of access.
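The "bubble up" behavior for most recently used colors can be sketched as a simple list substitution. The list representation of the top level menu and the function name are assumptions for illustration only.

```python
def bubble_up_mru(top_level, picked_shade, parent_color):
    """Sketch of MRU bubbling: replace the parent color control's swatch
    in the top level menu with the shade last picked in its sub-menu, so
    the user has quick access back to it. Representation is assumed."""
    return [picked_shade if c == parent_color else c for c in top_level]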
According to further embodiments, the application may detect an input 212 selecting a color control 206 or hovering over the color control 206. The application may display an indicator 208 in response to the input to notify the user of the selected color. The indicator 208 may enable the user to more easily notice the selected color if the user action blocks the color control 206 from view. The indicator 208 may display the selected color through the control 206 and draw attention to the assigned or to be assigned color by flashing or through other animation.
FIG. 3 illustrates another example context based color menu with content type controls according to embodiments. Diagram 300 displays example context based color menu 302 executing a variety of commands corresponding to user input.
As previously stated, an input may include a touch action, a gesture action, a keyboard input, a mouse input, or a pen input. The content may include textual content, a shape, a shading, a 3D effect, a line, a fill, a highlighter, a cell color, etc. The user may assign a color to a selected content, a single content type from a set of content types, or multiple content types from the set.
According to some embodiments, the application may position a set of color controls within a half section of the context based color menu 302. A user may choose to activate a color control 306 to assign a color to content. The application may display an indicator 308 on a color control to indicate current or recent input on the color control. As stated in relation to menu 202, a navigation control 310 may be used to navigate to a prior context based menu, which may include a top level color menu or a menu providing other functionality to manage content, such as content type selection or general functions such as copy and paste.
According to other embodiments, the application may position a set of content type controls radially within the other half section of the context based color menu. Example content type controls may include a textual control 304 to assign a selected color to selected text within the content. Another example content type control may include fill control 312 to assign a selected color to a selected object within the content. Yet another example content type control may include a highlight control 314 to highlight the content with the selected color. Furthermore, another example content type control may include a shading control 316 to assign a selected color to selected shading within the content.
According to yet other embodiments, the application may detect an input activating a color control 306. The application may detect another input selecting a content type control 312 from the set of content type controls. Next, the application may assign a color associated with the selected color control 306 to the content associated with the content type control 312 to fill a selected object within the content. Selecting different types of content (text, shape, line, etc.) may expose an entirely different palette. For example, for text colors one might want fairly dark colors since they are on a white background. However, for a highlighter one may want fairly light (or bright) colors since the text under the highlighter needs to remain visible.
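The idea of exposing a different palette per content type can be sketched with a lookup table. The specific hex values, type names, and dictionary-based content model below are illustrative assumptions, chosen only to show dark hues for text and light hues for highlighting.

```python
# Illustrative sketch: each content type exposes its own palette, e.g.
# darker hues for text on a white page but light/bright hues for a
# highlighter. Palettes and type names are assumptions, not from the spec.
PALETTES = {
    "text":      ["#000000", "#1F3864", "#7B241C"],   # darker hues
    "highlight": ["#FFFF00", "#00FFFF", "#FF99CC"],   # light/bright hues
    "fill":      ["#4472C4", "#ED7D31", "#70AD47"],
}

def palette_for(content_type):
    """Return the palette exposed for a given content type."""
    return PALETTES.get(content_type, PALETTES["fill"])

def assign_color(content, content_type, color):
    """Assign a palette color to the attribute named by content_type."""
    if color not in palette_for(content_type):
        raise ValueError("color not in palette for this content type")
    content[content_type] = color
    return content
```

A production menu would likely derive these palettes from a document theme rather than hard-coding them.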
FIG. 4 illustrates an example scenario of applying a color to a content type using a context based color menu according to embodiments. Diagram 400 displays example context based color menu 402 executing a variety of commands associated with user input.
According to some embodiments, the menu 402 may have a single content type control 404. The application may position a set of color controls radially within the menu 402 adjacent to the content type control 404. The application may detect an input activating the content type control 404 and launch a sub-menu of content type controls 406. The application may present the sub-menu of content type controls and detect another input selecting one of the content type controls. In response to the other input, the application may assign a selected color from menu 402 to the content associated with the selected content type control.
In some embodiments, content types (text, highlight, shape, line, etc.) may be presented at a top level menu, and clicking on them may expose a different color menu for each type. Depending on the complexity of the color scheme, multiple levels of color selection sub-menus (hierarchical) may be provided for each content type. In other embodiments, the menu may include an item (like control 404) that can be tapped to enable a user to toggle between the options that are currently shown in the submenu. This may allow the user to quickly switch between the different types.
FIG. 5 illustrates an example context based color menu with additional sub-level color controls integrated into the menu according to embodiments. Diagram 500 displays context based color menu 502 integrating multiple levels of color controls into one menu.
According to some embodiments, the menu 502 may have a set of color controls positioned radially within an outside region of the menu 502. An input 516 such as a tap action may select a color control 506 to assign a color to the selected content. The application may display an indicator 508 to show the current selection of the color control 506.
According to other embodiments, the application may position a set of additional sub-level color controls 510 radially inside the set of color controls within the menu 502. The application may detect another input selecting one of the additional sub-level color controls. In response to detecting the other input, the application may replace the set of color controls with another set of color controls associated with the selected sub-level color control 512 in menu 502. The other set of color controls may be shades of the sub-level color control 512. The application may also display another indicator 514 to inform the user of the currently selected sub-level color control. Alternatively, the application may replace the outside set of color controls with the additional sub-level color controls. Additionally, if the set of additional sub-level controls has another level of sub-level color controls, then the application may place the other level of sub-level color controls within the inside color control set of menu 502. If no other level of controls exists, then the application may remove the inside set of color controls.
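Generating the replacement set of shades for a selected sub-level color can be sketched as follows. Scaling each RGB channel toward black is one plausible shading scheme assumed for illustration; the specification does not mandate any particular color math.

```python
def shades_of(rgb, steps=5):
    """Sketch of generating a sub-level set of shades for a selected
    color control by scaling its RGB channels toward black. The scheme
    and step count are illustrative assumptions."""
    r, g, b = rgb
    return [(round(r * f), round(g * f), round(b * f))
            for f in (1.0 - i / steps for i in range(steps))]
```

The returned list would replace the outer (or inner) color control set as described above, with each tuple backing one control.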
FIG. 6 illustrates an example context based color menu with color indicators according to embodiments. Diagram 600 displays context based color menu 602 illustrating indicators to inform a user of a state of the prior or currently assigned color to the content.
According to some embodiments, a user interface of an application may display an indicator 608 as described in the above menu embodiments to inform a user about a selected color control 610. An indicator 608 may enable a user to better view a selected color control 610, particularly in a scenario where the input blocks the user from viewing the selected color control 610. An input 612 such as a tap action may select a color control 610 in menu 602 to assign a color associated with the control 610 to the content. In response to the input 612, the application may position the indicator 608 on an edge of the selected control 610. The edge may be determined by the application locating the longest edge of the control 610 if the control 610 has a quadrilateral shape such as a square, a rectangle, etc. Overlaying the indicator 608 on the longest edge of the color control 610 may make it easier to view the selected color control 610 by providing the user with a large visual indicator representing the selected color. Alternatively, the application may overlay an indicator over the entire edge of the color control. An example may include a circular color control with an indicator around the circumference. The application may also execute an animation over the indicator 608, such as a flashing animation, to remind the user about the selected color control 610 or assigned color. In addition, the application may activate an indicator 608 in response to an input 612 such as a hover over the color control 610.
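Locating the longest edge of a control's outline, so that the indicator can be overlaid there, can be sketched with simple geometry. The vertex-list representation and function name are assumptions for illustration.

```python
import math

def longest_edge(vertices):
    """Sketch: find the longest edge of a control's polygon outline so a
    selection indicator can be overlaid on it. Vertices are (x, y) pairs
    in order; the representation is an illustrative assumption."""
    best, best_len = None, -1.0
    n = len(vertices)
    for i in range(n):
        a, b = vertices[i], vertices[(i + 1) % n]
        length = math.dist(a, b)   # Euclidean edge length (Python 3.8+)
        if length > best_len:
            best, best_len = (a, b), length
    return best, best_len
```

For a circular control, as noted above, the indicator would instead trace the circumference rather than a single edge.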
According to other embodiments, the application may display a tooltip 604 showing the color in response to the user action selecting the color control. The application may fill the tooltip with the color associated with the color control 610 to preview for the user the color to be assigned to the content or to remind the user of the color recently assigned to the content. In addition to filling the tooltip with the color, a name for the color may also be displayed in the tooltip. This may help, for instance, people who are color blind or who have other visual impairments. Additionally, the application may display a drop shaped indicator 606 showing the color, positioned adjacent to a location of the user action. Similar to the tooltip 604, the application may fill the drop shaped indicator 606 with a color associated with the color control 610. Similar to indicator 608 and tooltip 604, the drop shaped indicator 606 may enable the user to better view the selected or assigned color, especially if the color control 610 is blocked from view due to the input 612.
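Deriving a displayable name for the tooltip's color can be sketched as a nearest-match lookup in RGB space. The small name table and the Euclidean-distance matching are assumptions for illustration; a shipping product would use a fuller, localized name list.

```python
import math

# A small illustrative name table; an assumption for this sketch only.
NAMED_COLORS = {
    "red":    (255, 0, 0),
    "green":  (0, 128, 0),
    "blue":   (0, 0, 255),
    "yellow": (255, 255, 0),
    "black":  (0, 0, 0),
    "white":  (255, 255, 255),
}

def nearest_color_name(rgb):
    """Sketch of naming a tooltip color for accessibility by choosing the
    nearest named color in RGB space (one plausible scheme, assumed)."""
    return min(NAMED_COLORS, key=lambda name: math.dist(NAMED_COLORS[name], rgb))
```

Showing such a name alongside the filled tooltip gives color-blind users a second, non-color cue for the selection.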
FIG. 7 illustrates other examples of context based color menus according to embodiments. Diagram 700 displays multiple examples of alternative menus to assign color to content using a context based menu.
According to some embodiments, context based color menu 702 may provide multi-layered color control sets positioned radially around a set of content management controls, along with top level color controls sharing an inside control area of the menu 702. The menu 702 may have a navigation control 710 to navigate to a previous context based menu, such as a top level color menu as discussed before. The menu 702 may also have a textual control 704 to assign a selected color to textual content. The menu 702 may also have highlight control 708 to highlight a selected object in the content with the selected color. In addition, the menu 702 may have top level color control 706 that may change the color controls of the menu 702 according to the shades of the top level color control 706.
According to other embodiments, the menu 712 may provide color controls through the use of sub-menus. In response to an input activating a sub-menu launcher control, the menu 712 may display a sub-menu 714 showing a continuous color control. The user may use control 716 to select a color to assign to the content. The control 716 may be swiped clockwise and counterclockwise to locate a preferred color. Embodiments are not limited to the above examples. The menu 712 may have other controls to assign color based on content type. The menu 712 may also have a navigation control to navigate to a previous context based menu, similar to the above described embodiments.
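One plausible mapping for such a continuous ring control is from the swiped angular position to a hue on the HSV color wheel; the mapping below is an illustrative assumption, not the scheme required by the embodiments.

```python
import colorsys

def color_at_angle(angle_deg):
    """Sketch of a continuous ring color control: map the control's
    angular position (swiped clockwise or counterclockwise) to a fully
    saturated hue on the HSV wheel. The mapping is assumed."""
    hue = (angle_deg % 360.0) / 360.0
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return tuple(round(c * 255) for c in (r, g, b))
```

Swiping the control then simply advances the angle, and the modulo keeps multiple full rotations well defined.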
According to yet other embodiments, the menu 718 may provide a continuous color control. The user may select a color for the content by choosing a point on the menu's color spectrum control. The menu may also have other controls to assign color based on content type. The menu may also have a navigation control to navigate to a prior context based menu as described above. Embodiments are not limited to a color control with a circular spectrum; any spectrum of colors may be used to provide a spectrum based color control to the user, including a circular, a rectangular, and other forms of spectrum.
Color control menu 720 illustrates another example embodiment. Similarly constructed to the menu 718, in menu 720, circles 722 indicate colors currently used on the content that is being edited. Instead of circles, any other form of indication may be used to provide user feedback on currently used colors.
According to another embodiment, the application may provide top level controls within a sub-menu providing sub color controls. The application may insert the top level controls from a context based color menu into a sub-menu providing a set of sub color controls associated with a selected top level control. The application may also position another indicator of the assigned color on a corresponding color control within the set of top level controls in the sub-menu.
According to other embodiments, the application may combine a set of color controls in a context based color menu with a set of font controls to produce a set of style controls. The application may produce the style controls by overlaying one or more font controls from the set of font controls onto every color control from the set of color controls. The application may apply a style associated with a selected style control to the content in response to detecting another user action activating one of the style controls.
The example commands, links, submenus, configurations, and context based color menus depicted in FIGS. 1 through 7 are provided for illustration purposes only. Embodiments are not limited to the shapes, forms, and content shown in the example diagrams, and may be implemented using other textual, graphical, and similar schemes employing the principles described herein.
FIG. 8 is a networked environment, where a system according to embodiments may be implemented. In addition to locally installed applications, such as application 922 discussed below, a context based color menu for touch and/or gesture enabled devices may also be employed in conjunction with hosted applications and services that may be implemented via software executed over one or more servers 806 or individual server 808. A hosted service or application may communicate with client applications on individual computing devices such as a handheld computer 801, a desktop computer 802, a laptop computer 803, a smart phone 804, and a tablet computer (or slate) 805 (‘client devices’) through network(s) 810 and control a user interface presented to users.
As previously discussed, a context based color menu may be used to assign color to content provided by the hosted service or application. For example, a browser application, a word processing application, a spreadsheet application, a calendar application, a note taking application, a graphics application, and comparable ones may make use of a context based color menu according to embodiments. The context based color menu may be activated through a variety of user actions such as selection of content, activation of a launcher indicator, detection of a predetermined touch or gesture action, etc.
Client devices 801-805 are used to access the functionality provided by the hosted service or application. One or more of the servers 806 or server 808 may be used to provide a variety of services as discussed above. Relevant data may be stored in one or more data stores (e.g. data store 814), which may be managed by any one of the servers 806 or by database server 812.
Network(s) 810 may comprise any topology of servers, clients, Internet service providers, and communication media. A system according to embodiments may have a static or dynamic topology. Network(s) 810 may include a secure network such as an enterprise network, an unsecure network such as a wireless open network, or the Internet. Network(s) 810 may also coordinate communication over other networks such as PSTN or cellular networks. Network(s) 810 provides communication between the nodes described herein. By way of example, and not limitation, network(s) 810 may include wireless media such as acoustic, RF, infrared and other wireless media.
Many other configurations of computing devices, applications, data sources, and data distribution systems may be employed to manage content color through context based color menu. Furthermore, the networked environments discussed in FIG. 8 are for illustration purposes only. Embodiments are not limited to the example applications, modules, or processes.
FIG. 9 and the associated discussion are intended to provide a brief, general description of a suitable computing environment in which embodiments may be implemented. With reference to FIG. 9, a block diagram of an example computing operating environment according to embodiments is illustrated, such as computing device 900. In a basic configuration, computing device 900 may be any device in stationary, mobile, or other form such as the example devices discussed in conjunction with FIGS. 1A, 1B, and 8, and include at least one processing unit 902 and system memory 904. Computing device 900 may also include a plurality of processing units that cooperate in executing programs. Depending on the exact configuration and type of computing device, the system memory 904 may be volatile (such as RAM), non-volatile (such as ROM, flash memory, etc.) or some combination of the two. System memory 904 typically includes an operating system 905 suitable for controlling the operation of the platform, such as the WINDOWS®, WINDOWS MOBILE®, or WINDOWS PHONE® operating systems from MICROSOFT CORPORATION of Redmond, Wash. The system memory 904 may also include one or more software applications such as program modules 906, application 922, context based color menu module 924, and detection module 926.
Context based color menu module 924 may operate in conjunction with the operating system 905 or application 922 and provide a context based color menu as discussed previously. Context based color menu module 924 may also provide commands, links, and submenus to manage color of content. Detection module 926 may detect a user input and execute a command associated with the input to assign color to selected content. This basic configuration is illustrated in FIG. 9 by those components within dashed line 908.
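The division of labor between the two modules could be sketched, purely for illustration, as follows. The class names, the command table, and the `handle` method are all hypothetical assumptions introduced for this sketch; the specification does not prescribe any particular implementation.

```python
from dataclasses import dataclass, field

@dataclass
class ColorMenuModule:
    """Illustrative stand-in for module 924: provides commands and
    submenus to manage color of content. The mapping below from input
    type to command name is an assumption, not part of the disclosure."""
    commands: dict = field(default_factory=lambda: {
        "tap": "assign_color",
        "press_hold": "open_submenu",
    })

@dataclass
class DetectionModule:
    """Illustrative stand-in for module 926: detects a user input and
    executes the command associated with that input."""
    menu: ColorMenuModule

    def handle(self, input_type: str, content: dict, color: str) -> dict:
        # Look up the command associated with the detected input.
        command = self.menu.commands.get(input_type)
        if command == "assign_color":
            # Execute the command: assign the color to the selected content.
            content["color"] = color
        return content
```

In this sketch the menu module only declares which commands exist, while the detection module interprets the raw input and carries the command out, mirroring the roles described for modules 924 and 926.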
Computing device 900 may have additional features or functionality. For example, the computing device 900 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Such additional storage is illustrated in FIG. 9 by removable storage 909 and non-removable storage 910. Computer readable storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. System memory 904, removable storage 909 and non-removable storage 910 are all examples of computer readable storage media. Computer readable storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computing device 900. Any such computer readable storage media may be part of computing device 900. Computing device 900 may also have input device(s) 912 such as keyboard, mouse, pen, voice input device, touch input device, an optical capture device for detecting gestures, and comparable input devices. Output device(s) 914 such as a display, speakers, printer, and other types of output devices may also be included. These devices are well known in the art and need not be discussed at length here.
Computing device 900 may also contain communication connections 916 that allow the device to communicate with other devices 918, such as over a wireless network in a distributed computing environment, a satellite link, a cellular link, and comparable mechanisms. Other devices 918 may include computer device(s) that execute communication applications, other directory or policy servers, and comparable devices. Communication connection(s) 916 is one example of communication media. Communication media can include therein computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media.
Example embodiments also include methods. These methods can be implemented in any number of ways, including the structures described in this document. One such way is by machine operations, of devices of the type described in this document.
Another optional way is for one or more of the individual operations of the methods to be performed in conjunction with one or more human operators performing some of the operations. These human operators need not be collocated with each other, but each can be only with a machine that performs a portion of the program.
FIG. 10 illustrates a logic flow diagram for a process of managing content color through context based color menu in touch and/or gesture enabled devices according to embodiments. Process 1000 may be implemented as part of an application or an operating system of any computing device capable of accepting touch, gesture, keyboard, mouse, pen, or similar inputs.
Process 1000 begins with operation 1010, where, in response to a user input, a context based color menu may be presented by a user interface of an application on a device screen to manage content. The context based color menu may position color controls in a variety of forms and may include content type controls integrated into the menu. Next, the user interface may detect another user input on the context based color menu at operation 1020. The input may be a tap, a swipe, a press and hold, or a similar user action including, but not limited to, touch, gestures, keyboard entries, mouse clicks, pen inputs, optically captured gestures, voice commands, etc. At operation 1030, the application may assign a color to the content according to the input.
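The three operations of process 1000 might be expressed, as one non-limiting sketch, in the following form. The function name, the palette dictionary, and the string-based input types are assumptions made for illustration only.

```python
def process_1000(first_input, second_input, content, palette):
    """Hypothetical sketch of process 1000; not the claimed implementation.

    first_input  -- the user input that triggers the menu (operation 1010)
    second_input -- the user input detected on the menu (operation 1020)
    content      -- the selected content to be colored (operation 1030)
    palette      -- assumed mapping of menu selections to color values
    """
    # Operation 1010: present the context based color menu in response
    # to a user input.
    menu_visible = first_input is not None

    # Operation 1020: detect another user input on the menu (a tap,
    # swipe, press and hold, voice command, etc.).
    selected_color = palette.get(second_input)

    # Operation 1030: assign a color to the content according to the input.
    if menu_visible and selected_color is not None:
        content["color"] = selected_color
    return content
```

As the surrounding text notes, an equivalent process could use fewer or additional steps, or a different order of operations; this sketch fixes one order only for readability.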
The operations included in process 1000 are for illustration purposes. Managing content color through context based color menus according to embodiments may be implemented by similar processes with fewer or additional steps, as well as in different order of operations using the principles described herein.
The above specification, examples and data provide a complete description of the manufacture and use of the composition of the embodiments. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims and embodiments.